From Wikipedia, the free encyclopedia

John David Sterman
Alma mater: MIT (Ph.D., 1982); Dartmouth College
Known for: Business Dynamics: Systems Thinking and Modeling for a Complex World
Awards: Jay W. Forrester Prize
Scientific career
Fields: Systems science
Institutions: MIT, New England Complex Systems Institute

John David Sterman is the Jay W. Forrester Professor of Management and the current director of the MIT System Dynamics Group at the MIT Sloan School of Management.[1][2] He is also co-faculty at the New England Complex Systems Institute. He is widely regarded as the current leader of the system dynamics school of thought and is the author of Business Dynamics: Systems Thinking and Modeling for a Complex World.

Sterman has twice been awarded the Jay W. Forrester Prize for the best published work in system dynamics,[3] won an IBM Faculty Award, won the Accenture Award for the best paper of the year published in the California Management Review, has won awards for teaching excellence seven times, and was named one of the MIT Sloan School's "Outstanding Faculty" by the Business Week Guide to the Best Business Schools. He has been featured on public television's News Hour, National Public Radio's Marketplace, and CBC television, and in Fortune, the Financial Times, Business Week, and other media for his research and innovative use of interactive simulations in management education and policymaking.

He was an undergraduate at Dartmouth College and received his Ph.D. from the MIT Sloan School of Management in 1982.[1]

His research focuses on improving managerial decision making in complex systems. He has pioneered so-called "management flight simulators" used for learning to manage the complexity of corporate and economic systems.
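
The core mechanic of such a simulator, in a deliberately stripped-down form, is an interactive loop: the player makes one decision per simulated period, a model of the business updates its state (including delays the player cannot see directly), and the results feed back into the next decision. The Python sketch below is a hypothetical illustration of that mechanic only; its dynamics, names, and numbers are invented and are not taken from Sterman's actual simulators.

  # Toy "management flight simulator" loop: the player sets hiring each
  # quarter, but new hires only become productive after a training delay.
  # All dynamics and numbers are illustrative assumptions.
  def run(quarters=8, hiring_delay=2):
      workforce = 100.0
      backlog = 0.0
      in_training = [0.0] * hiring_delay   # hires made but not yet productive
      for q in range(quarters):
          demand = 100.0 + 10.0 * q        # workload grows every quarter
          backlog = max(0.0, backlog + demand - workforce)
          hires = float(input(f"Q{q + 1}: workforce {workforce:.0f}, backlog {backlog:.0f}. Hire how many? "))
          in_training.append(hires)        # the decision enters the delay pipeline
          workforce += in_training.pop(0)  # only earlier hires arrive now
      print(f"Final backlog: {backlog:.0f}")

  if __name__ == "__main__":
      run()

Even this tiny loop reproduces the typical experience of such simulators: players who ignore the training delay tend to react too late and then overshoot.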

YouTube Encyclopedic

  • Introduction to System Dynamics: Overview
  • John Sterman on System Dynamics
  • Sloan Alumni Online: John Sterman, PhD '82

Transcription

PROFESSOR: This is a full semester class divided into two half semester pieces. 15.871 is the six-unit H1 class. That's what probably most of you are thinking about doing. What I would urge you to do is to take both halves, because you and I will co-teach the fall semester. The second half is called 15.872. And although we think this first half is good, the real value in terms of developing a meaningful capability for you to be able to use system dynamics and systems thinking effectively will come if you take the full semester. You don't have to exercise that real option today. You can wait and make that decision a little later. But think about it right now for your semester planning. There are five assignments over the course of this semester. So the good news is there are no exams. The bad news is that there are five assignments, and this class has a reputation, which I believe is reasonably well deserved, for having a heavy workload. The reason for that is clear. I can't teach you anything. I know you're giggling over there. But it's absolutely true. I can't teach you anything. All that I can do-- all that we can do-- is create an opportunity for you to learn for yourself. You have to do it, try it, practice it, if you're going to develop the capability. That is not going to happen by coming to class. Now, you need to come to class, and participation is part of the grade. But that's not sufficient. It's also necessary for you to try everything out. And that is why we have those assignments. You're also going to need to read the textbook. So this is the book. And it's pretty fat and heavy. You're not going to be asked to read the whole thing. The bad news on that is I wrote it. So I can tell you that it'll definitely cure your insomnia. I don't care if you buy it. Although I always say, if you buy two, then you can kind of go like this all day, and you can save money by quitting your gym membership. But I don't care if you buy it. I do care that you read it. You can't do well on the assignments unless you read the material in the book carefully and work through some of those examples. Chapter one is what you need to be reading first. The syllabus tells you what to read for every day. In addition, from time to time, we are asking you to read a couple of short case studies or other material. And we'll provide follow-up articles from the professional literature from time to time as well. So the question is why? Why do we need something like system dynamics and systems thinking? And I think the answer is not simply that the world is changing faster and faster. Things are accelerating. Everybody knows that. That's kind of the price of admission to the world today. It's that, despite all the tools and methods that we've got, all the analytic power and our cleverness, things are getting harder and harder. And more and more of the policies that we implement are failing to solve the pressing challenges that we face. And this is not just what I'm saying; this is what the senior leaders of the organizations with whom I work tell me. And the thoughtful ones-- the most thoughtful ones-- say it's not just that things are getting harder and more difficult despite our cleverness and our analytic power, but because of it. That we're too clever for our own good. And I illustrate this traditionally with this picture of one of the leaders of an organization that I've worked with. And here is the poor guy in this office.
PROFESSOR: Now many of you have seen this before, but I think it's just a great representation of what's going on. Like most senior managers, or lower level managers, he is completely squeezed by pressures on all sides. Can't breathe. Claustrophobic. And what you're asked to do as a manager is to be decisive. You've got to make decisions. Boom. Things are now much better for you. Now you can begin to breathe more easily, see out to the side. Relief. Things are great. But as you may suspect, there could be some unanticipated side effects. Now, the reason I like this, and the reason I'm showing it to you again, is that I think it's a great way to capture the core of what a lot of systems thinking is about. Why does this happen? Why does this happen? And why don't people learn? So it's not just that it happens once, but people do it over and over again. So why? That's a real question for you. So what do you think? Why might this happen? And why might this phenomenon persist? Yeah, go ahead.

AUDIENCE: It's only thinking in the short term, about the short-term implications of that action.

PROFESSOR: Great. So a short time horizon, and not thinking about what might happen later. So après moi, le déluge. I don't care. I'll do what's good for me in the short run. I don't care if it destroys the world later. OK, great. What else? Yeah, go ahead.

AUDIENCE: Feedback loops. So you would get some feedback, and you won't really consider it the way that it really is.

PROFESSOR: So there's definitely a feedback loop here, right? And the problem is that it takes too much time for that to happen. So the time delay in getting the feedback-- maybe missing feedback-- is connected to the short time horizon. In fact, what's going to happen in most organizations to this manager right about now? He solved the problem, right? So what happens to him?

AUDIENCE: He gets promoted.

PROFESSOR: Of course. He gets promoted. You get to sit in the chair. It's a combination of a short time horizon, not only for that guy, but for the people who are evaluating his performance and maybe encouraging him to have a short time horizon. That's not going to be good for anybody in this situation. What else? What else might be going on? Can we get somebody over on this side? [? Argoff, ?] what do you think?

AUDIENCE: I think it's also, like you said, when you're thinking only short term, you find that you can make a decision without thinking about what the impact of--

PROFESSOR: This is a really important point. And let me put it into our terms. When people say, it wasn't my fault, the reason we failed was some outside effect, some unanticipated side effect, something that came from out there, it wasn't my fault-- they're trying to persuade you that they, in fact, are great managers and shouldn't be held responsible for the bad outcome. In fact, almost all the time, it is at least partially the result of their own past decisions. Feedback they didn't understand and didn't recognize. And what they're actually trying to do is persuade you that it wasn't their fault. But what they're really communicating is how narrow and blinkered and inadequate their understanding is of the system in which they're embedded-- what we would call their mental model. I think that's where you were going. So that's a really important idea in what we're going to be about. So one of the mental models that I think is the most damaging out there is this open loop mental model. And here's the question.
In a project that you've been involved in, or that you were reviewing, how many times have you seen this picture? Can I see hands? Who's seen it? Almost everybody. Who's drawn it in one of their project proposals? About 3/4 of the same hands. This is a very interesting, unintended revelation of the mental models that people hold about complex dynamical systems. And it basically says there's a beginning and a middle and an end to the project. We're going to identify the issue. We're going to gather the data, evaluate our choices, select the optimal solution, and then implement. And of course the students in the last class, when I said, what's wrong with this picture, all said, oh well, you know, you're a professor, so you've never implemented anything in your life. OK. Now, in fact, a lot of my research is devoted to the question of why implementation so often fails. We're going to talk about that. System dynamics isn't useful unless you can actually make things different. If you can't catalyze change in the organizations in which you're engaged, none of this is meaningful. You might as well become a professor. So we're going to talk a lot about that-- and I mean that in all the negative senses that you're laughing about. We're going to talk about system dynamics in action, especially in the second half of the class. You're going to see a lot of case studies of how people have been able to use these tools effectively in difficult political and organizational settings. So that's an issue. But it's not the real problem. The real problem with this is that it has this open loop, one-way, sequential perspective that says there's a beginning, middle, and end to the project. I don't know about you, but no project I've ever been involved in has ever gone that way. There's always iteration, feedback. We have to go back to the beginning-- almost always unintended, unplanned iteration-- because we go through, and we find out as we gather the data, as we interview the folks that are engaged, as we evaluate our technologies or the supply chain for the new product or whatever it is, that we really didn't understand the situation. We really didn't understand what the real problem is. We have to loop back to the beginning. This happens continually all the way through the project. And it's that feedback that's critical here. So this is the metaphor that I want you to fix in your mind. You make decisions. Your decisions change the world. And then that creates new information, which changes your next decision, in a continual, emergent, iterative set of feedback processes. Now let me make this a little more formal. Again, this is a slight review for those of you who have had me in orientation. But it's worth it, especially if you haven't seen this for a while. So here's that open loop perspective. People say, I know my goals. I want better market share. I want more profitability. I want a bigger house, or a nicer car, whatever it is. That's going to motivate my decisions. And then my decisions are going to change the state of the world, the state of the system. Problem solved. That's wrong. I can't know what decisions to make just because I know where I want to be. I have to also know where I am right now. There has to be a feedback. So the example I always give is that I'm a bicycle commuter, and as I was riding into MIT from Lexington this morning, as every day, I must keep my bike on the right hand side of the path. If I don't, I'm going to have a crash.
So just knowing that I need to be on the right hand side is not enough for me to know how to turn the handlebars. I have to have the feedback from where my bicycle actually is in order to know which way to turn the handlebars. It is the same for you, driving your car or flying an aircraft. Now imagine flying your organization through hostile skies, dogfighting with the competition, keeping the investors happy and calm in the back, and serving them nice drinks and hors d'oeuvres. If flying your company were just as easy as flying an aircraft-- which isn't that easy, by the way; it doesn't take much bad weather, fatigue, or substances in your bloodstream to degrade your abilities so much that you're going to crash the plane, crash your car, or crash your bicycle-- if it were as easy to run your company as it is to ride a bike, no problem. We wouldn't need this class. But it's not. And it's not easy in part because that feedback loop represents the intended effects of your decisions, but doesn't capture the unintended effects. It's only a piece of the system. So you're embedded in a much more intricate, complex system, and that's what's going on for you. That loop represents your mental models of what you ought to do. But mental models are limited. And all the impacts of your decisions-- all the effects of your decisions that you didn't think about in advance, and that aren't part of your mental model-- are going to manifest as so-called side effects. Remember, there's no such thing as a side effect. There are just the effects that are in your mental model, which you were counting on, and everything else is going to manifest as a so-called side effect. And they're usually going to feed back in a way that's opposite to your goals. Much more interesting, you're not the only player in the world. So there are all the other actors out there, all the other agents out there. And they have their own goals, which are typically different from your goals. You want more market share; so do they. There is only 100% to divide up. And every time you make decisions, even if they're efficacious and pull the world closer to your goals, they're necessarily going to be pulling the world farther away from those other folks' goals-- your customers, your suppliers, your employees, the investors, the competitors, the communities in which you operate, the natural world in which all of that is embedded. That's all going to be there. And those goals are going to motivate them to take action, to try to bring the state of the world back to what they want it to be. Their mental models are limited too. And so they're going to generate unintended, so-called side effects. And now this is getting to be a fairly complex thing to manage-- not as easy as riding a bicycle. The whole story here is about expanding the boundaries of your mental models so that more and more of this structure is something that you can begin to think about and try to take into account when you make decisions. You're never going to get it all, because all models are wrong. The model is not the real system. Only the reality is the reality. And everything in your head is a limited, filtered, imperfect representation. But we can do a lot better than the mental models that we have now. So what are we going to do? What we're going to do in this class is develop tools to elicit your mental models, articulate them, and do that in the context of busy people in organizations.
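
The bicycle example can be reduced to a few lines of code. The sketch below is not from the lecture; it is a minimal, hypothetical illustration contrasting an open loop rider, who acts only on the goal, with a closed loop rider, who compares the goal against feedback about where the bike actually is. The drift and gain values are arbitrary assumptions.

  # Open loop vs. closed loop steering, with the "bicycle" reduced to one
  # state variable: lateral position on the path (goal = 0.0, the lane centre).
  def open_loop(steps=20, drift=0.3):
      """Steer from the goal alone, never looking at the actual position."""
      position = 0.0
      for _ in range(steps):
          position += drift          # wind, camber, etc. push the bike off course
      return position                # errors accumulate without correction

  def closed_loop(steps=20, drift=0.3, gain=0.5):
      """Steer using feedback: compare the actual position with the goal."""
      position = 0.0
      for _ in range(steps):
          position += drift
          error = 0.0 - position     # information about the state of the system
          position += gain * error   # decision: turn the handlebars toward the goal
      return position

  print(f"open loop  : {open_loop():.2f} m off the path")
  print(f"closed loop: {closed_loop():.2f} m off the path")

With feedback the error settles at a small constant set by the drift and the gain; without it, every disturbance simply accumulates.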
We're going to explicitly account for feedback, and stocks and flows, and time delays, and non-linearities, and the other elements of complex dynamical systems. And then we're going to use simulation to figure out what that means. Not because the model is going to give us the answer-- all models are wrong-- but because the simulation models are going to give us insight that improves our mental models, and the mental models of all the people who need to be involved in order for change to happen, so that people are empowered with high leverage, effective policies to go out there and make a difference. You can read in the syllabus how we're going to do that. But I think there are three core ideas I'd like to leave you with before we break for today. The first is that it's the structure of complex systems that generates their behavior. That structure consists of the physics of the system, the information that's available to you, and then the decision rules that you use to turn that information into action. All three of those are relevant here. Mental models matter a lot. It is not enough just to come up with the right answer. And it's not enough just to change the physics of the system, or the information with a new IT system, or the incentives that people face. All those are important, but they aren't generally sufficient. And one of the very powerful mental models that's out there is what we call, in psychology, the fundamental attribution error. And this is an idea you should have learned from the beer game. It's the idea that if you ask me why I screwed up, I've got reasons, known as excuses. It was the customer's fault. It was somebody else's fault. The sun was in my eyes. But if I'm asked to explain why you screwed up, it's because you're not capable. You don't have what it takes, you and everybody like you. And that is almost always wrong. And it's low leverage. It doesn't help. So to put that into practice in this class, when we come in here and you work with us this semester, we're going to make the following basic assumption: we believe that everybody in this room is intelligent, is capable, cares about doing their best, and wants to learn.
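
As a concrete, if toy, illustration of "structure generates behavior", here is a short simulation sketch of a single stock with a delayed inflow and a gap-closing decision rule. It is my own example with invented parameters, not course material, but, as in the beer game, an ordering rule that looks sensible locally produces oscillation once a delay sits inside the feedback loop.

  # One stock (inventory), one delayed inflow (shipments), one decision rule
  # (orders). Euler integration with a unit time step. Parameters are invented.
  DT = 1.0                    # time step (weeks)
  TARGET = 100.0              # desired inventory
  ADJUST_TIME = 2.0           # how aggressively the inventory gap is closed
  DELAY = 4                   # shipping delay, in time steps

  inventory = 100.0           # the stock
  demand = 10.0               # outflow
  pipeline = [10.0] * DELAY   # orders placed but not yet delivered

  for week in range(30):
      if week == 5:
          demand = 15.0                       # a step in demand perturbs the system
      arrivals = pipeline.pop(0)              # delayed inflow reaches the stock
      # Decision rule: replace demand and close the inventory gap, while
      # (as in the beer game) ignoring what is already in the pipeline.
      orders = max(0.0, demand + (TARGET - inventory) / ADJUST_TIME)
      pipeline.append(orders)
      inventory += DT * (arrivals - demand)   # stock integrates inflow minus outflow
      print(f"week {week:2d}  inventory {inventory:7.1f}  orders {orders:6.1f}")

In this toy model, widening the decision rule's boundary to account for the orders already in the pipeline tends to damp the oscillation, which is exactly the kind of mental-model expansion the lecture argues for.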

Publications

John Sterman has written several books and numerous articles.

References

  1. E. Cabell Brand (2010). If not me, then who?: how you can help with poverty, economic opportunity, education,... [S.l.]: Iuniverse Inc. p. 68. ISBN 978-1-936236-12-1. Retrieved 26 January 2011.
  2. Walsh, Bryan (28 October 2008). "What the Public Doesn't Get About Climate Change". Time. Retrieved 13 April 2021.
  3. M, Arnaud (11 March 2021). "World2 model, from DYNAMO to R". Medium. Retrieved 21 February 2022.
