System one suggested the incorrect intuition, and system two endorsed it and expressed it in a judgment. This is about letting the past influence present decisions. Kahneman explains that mood is never constant; however, we are capable of labelling moments in our lives as one or the other. The second part of the book will discuss heuristics and biases before we move on to part three and overconfidence.
Kahneman explains that the second system involves thinking that is more complex and more mentally draining. It takes concentration and agency on the person's part to process the thoughts. Kahneman explains that system two is easier to identify with: it is the conscious self, the version of you that makes decisions, makes choices, and has reasons and beliefs. To monitor your own behaviour in certain situations, or to increase your natural walking pace, you would be tapping into system two.
It may seem that system two is the dominant system, but Kahneman explains that system one is actually the hero here. Being such a rapid process, system one will inevitably run into problems from time to time, and in this situation, system two will step in for support. Sometimes the situation will call for more detailed processing than system one can account for. When system one simply cannot provide an answer, system two will step up to the plate. Kahneman explains that system two is designed to monitor the thoughts and actions that system one promotes.
Not only will it monitor these, but it will also control them by encouraging, suppressing or modifying behaviours. Kahneman explains that when it comes to doubt, the systems differ. System one is not capable of experiencing doubt, whereas system two has the capacity for it, because system two can entertain two incompatible options simultaneously.
Each system produces its own anchoring effect: deliberate adjustment is an operation of system two, while priming is an automatic manifestation of system one. Kahneman explains that the relationship between the two systems can be seen in the effects of random anchors.
He explains that usually, the study of anchoring effects has been based on judgement and choice, the characteristics of system two. However, the data that system two uses derives from memory, the automatic territory of system one. This also means that system two is susceptible to the bias of anchors, because an anchor makes some information easier to access than other information.
Kahneman acknowledges the difficulty of trying to avoid biases, but he also emphasises the importance of doing so in order to reduce the risk of mistakes. Kahneman believes that the concept of risk is a human invention: people designed it to help them navigate dangers, fears and uncertainties.
Kahneman explains that system two is capable of failure, and there are two reasons why this happens: ignorance and laziness.
System one is capable of making what Kahneman refers to as extreme predictions, and it is prone to making sometimes irrational predictions from very little evidence. He explains that this is because system one is likely to jump to overconfident conclusions without enough support. System two, for its part, has trouble with regression to the mean. We humans are constantly fooling ourselves by constructing flimsy accounts of the past and believing they are true.
Kahneman explains that the illusion we face is that we are predisposed to assume that we understand the past, and by understanding the past, we believe that we can also know what the future might hold. He explains that the key to comprehending the future is to adjust the language we use in relation to our past beliefs. Language shapes our reality, so it is essential that we use the right language when thinking about the past.
It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. Kahneman explains that the people who struggle the most with hindsight are those who act as agents on behalf of others and those who are decision-makers, such as doctors, financial advisors, social workers or politicians.
As well as blaming them all too easily, we often fail to credit them when the decisions they make do work out, assuming that they are just doing their job. Kahneman explains that another consequence of hindsight is that the risk seekers who take big gambles and make a big win are often celebrated undeservedly.
Regardless of their reckless behaviour, the result was good, so they are never punished for their luck. According to system one, the world should be a predictable, straightforward and easy-to-understand place. But this is simply an illusion.
The mistake appears obvious, but it is just hindsight. You could not have known in advance. We hold a lot of confidence in our opinions and our judgements.
Evidence simply does not come into it. Kahneman stresses just how strange this is: we have very little knowledge, yet we hold a lot of confidence in our beliefs. In making assumptions and predictions about the future, mistakes are inevitable, but Kahneman explains that we can learn from these mistakes.
The first lesson we can learn is that the world is largely unpredictable and errors will always be made. Secondly, we can learn that confidence should not be trusted as a gauge of accuracy; in fact, low confidence is often closer to the truth. Kahneman then turns to the optimists whose decisions make a difference: the inventors, the entrepreneurs, the political and military leaders, not average people.
They got to where they are by seeking challenges and taking risks. In most cases, when offered two options, one a gamble with the higher expected value and the other a sure thing worth somewhat less, most people will pick the sure thing. This is because we crave the security of knowing the outcome and avoiding the risk. Kahneman explains that in some cases, decision-makers will even accept a sure thing worth considerably less than the gamble's expected value, determined to avoid any potential risk.
Risk-taking of this kind often turns manageable failures into disasters. Kahneman describes loss aversion: the motive to avoid losses is stronger than the motive to achieve gains. To use money as an example, we are more strongly motivated not to lose money than to make it.
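Loss aversion can be made concrete with the prospect-theory value function. The parameters below (a curvature of 0.88 and a loss-aversion coefficient of 2.25) are the standard estimates from Tversky and Kahneman's later work, not figures given in this review; treat this as a minimal sketch:

```python
# Prospect-theory value function, a sketch. Parameters are the commonly
# cited Tversky-Kahneman (1992) estimates: alpha = beta = 0.88
# (diminishing sensitivity), lambda = 2.25 (loss aversion).

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss of size x."""
    if x >= 0:
        return x ** alpha             # gains: concave curve
    return -lam * ((-x) ** beta)      # losses: steeper curve

gain = prospect_value(100)    # felt value of winning $100
loss = prospect_value(-100)   # felt value of losing $100

print(f"value of +$100: {gain:.1f}")     # ≈ 57.5
print(f"value of -$100: {loss:.1f}")     # ≈ -129.5
print(f"ratio: {abs(loss) / gain:.2f}")  # 2.25
```

With these parameters, losing $100 feels roughly 2.25 times as bad as winning $100 feels good, which is exactly the asymmetry loss aversion describes.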
By setting a goal you never reach, you register a loss; by exceeding the goal, you register a gain. Even if you don't have the patience for all the chapters, don't neglect the intro and conclusion. The TL;DR alone doesn't suffice.
This book is in my top 10 most influential of my life; highly recommended, especially in tandem with Haidt's "Righteous Mind"; these two highly complementary books form a multidimensional mirror for the human condition. When you come late to the party, writing yet another review, you have a certain freedom to write something as much for your own use as for other readers, confident that the review will be at the bottom of the pile.
Kahneman's thesis is that the human animal is systematically illogical. Not only do we mis-assess situations, but we do so following fairly predictable patterns. Moreover, those patterns are grounded in our primate ancestry. The first observation, giving the title to the book, is that eons of natural selection gave us the ability to make a fast reaction to a novel situation.
Survival depended on it. So, if we hear an unnatural noise in the bushes, our tendency is to run. Thinking slow, applying human logic, we might reflect that it is probably Johnny coming back from the Girl Scout camp across the river bringing cookies, and that running might not be the best idea. However, fast thinking is hardwired.
The first part of the book is dedicated to a description of the two systems, the fast and slow system. Kahneman introduces them in his first chapter as system one and system two. Chapter 2 talks about the human energy budget. Thinking is metabolically expensive; 20 percent of our energy intake goes to the brain.
Moreover, despite what your teenager tells you, dedicating energy to thinking about one thing means that energy is not available for other things. Since slow thinking is expensive, the body is programmed to avoid it. Chapter 3 expands on this notion of the lazy controller. We don't invoke our slow thinking, system two machinery unless it is needed. It is expensive. As an example, try multiplying two two-digit numbers in your head while you are running.
You will inevitably slow down. Kahneman uses the example of multiplying two-digit numbers in your head quite frequently. Most readers don't know how to do this; check out "The Secrets of Mental Math" for techniques. Kahneman and I being slightly older guys, we probably like to do it just to prove we still can. Whistling past the graveyard: we know full well that mental processes slow down with age. Chapter 4 - the associative machine - discusses the way the brain is wired to automatically associate words with one another, concepts with one another, and a new experience with a recent experience.
Think of it as the bananas vomit chapter. What will you think of next time you see a banana? Chapter 5 - cognitive ease. We are lazy. We don't solve the right problem, we solve the easy problem. Chapter 6 - norms, surprises, and causes.
A recurrent theme in the book is that although our brains do contain a statistical algorithm, it is not very accurate. It does not understand the normal distribution. We are inclined to expect more regularity than actually exists in the world, and we have poor intuition about the tail ends of the bell curve.
We have little intuition at all about non-Gaussian distributions. Chapter 7 - a machine for jumping to conclusions. He introduces a recurrent example: a bat and a ball together cost $1.10, and the bat costs one dollar more than the ball. How much does the ball cost? System one, fast thinking, leaps out with an answer (ten cents) which is wrong. It requires slow thinking to come up with the right answer, and the instinct to distrust your intuition.
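The arithmetic that system two has to do here is simple, which is part of the point. A quick check, using the book's figures of $1.10 for the pair and a $1.00 difference:

```python
# Bat-and-ball problem: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting gives 2*ball + 1.00 = 1.10, so ball = 0.05,
# not the intuitive 0.10.

total, difference = 1.10, 1.00
ball = (total - difference) / 2   # from 2*ball + difference = total
bat = ball + difference

print(f"ball = ${ball:.2f}")   # $0.05
print(f"bat  = ${bat:.2f}")    # $1.05
```

The intuitive answer of ten cents fails the check: a $0.10 ball would make the bat $1.10 and the pair $1.20.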
Chapter 8 - how judgments happen. Drawing parallels across domains. If Tom was as smart as he is tall, how smart would he be? Chapter 9 - answering an easier question. Some questions have no easy answer. Section 2 - heuristics and biases Chapter 10 - the law of small numbers.
In the realm of statistics there is a law of large numbers. The larger the sample size, the more accurate the statistical inference from measuring them. Conversely, a small sample size can be quite biased. I was in a study abroad program with 10 women, three of them over six feet.
Could I generalize about the women in the University of Maryland student body? Conversely, I was the only male among the 11 students. Could they generalize anything from that?
In both cases, not much. Chapter 11 - anchors. An irrelevant number is a hard thing to get rid of. For instance, the asking price of a house should have nothing to do with its value, but it does greatly influence bids. Chapter 12 - the science of availability. If examples come easily to mind, we are more inclined to believe the statistic.
If I know somebody who got mugged last year, and you don't, my assessment of the rate of street crime will probably be too high, and yours perhaps too low. Newspaper headlines distort all of our thinking about the probabilities of things like terrorist attacks.
Because we read about it, it is available. Chapter 13 - availability, emotion and risk. Chapter 14 - Tom W's specialty. This is about the tendency for stereotypes to override statistics. If half the students in the university are education majors, and only a tenth of a percent study mortuary science, the odds are overwhelming that any individual student is an education major. Nonetheless, if you ask about Tom W, a sallow, gloomy type of guy, people will ignore the statistics and guess he is in mortuary science.
Chapter 15 - less is more. Linda is described as a very intelligent and assertive woman. What are the odds she is a business major?
The odds that she is a feminist business major? Despite the mathematical impossibility, most people will think that the odds of the latter are greater than the former. Chapter 16 - causes trump statistics. The most important aspect of this chapter is Bayesian analysis, which is so much second nature to Kahneman that he doesn't even describe it. The example he gives is a useful illustration: 85 percent of a city's cabs are Green and 15 percent are Blue, and a witness who is right 80 percent of the time says the cab in the accident was Blue. Given these numbers, most people will assume that the cab in the accident was Blue because of the witness testimony. Now the surprise: work it out with Bayes' theorem and the chance the witness was right is only about 41 percent; the cab was more likely Green. The book also gives a causal variant of the problem; the two versions are mathematically identical, but people's opinions differ. I recommend that you work through the calculation, because Bayes' theorem is cited fairly often and is kind of hard to understand. It may be simple for Kahneman, but it is not for his average reader, I am sure.
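For readers who want the mechanics, here is the calculation as a short sketch. The figures used (85 percent Green cabs, 15 percent Blue, a witness who is right 80 percent of the time) are the ones from the book's cab problem:

```python
# Bayes' theorem applied to the cab problem.
# Prior: 15% of cabs are Blue. Witness accuracy: 80%.
# We want P(cab was Blue | witness says Blue).

def posterior_blue(prior_blue, witness_accuracy):
    prior_green = 1 - prior_blue
    says_blue_if_blue = witness_accuracy        # correct identification
    says_blue_if_green = 1 - witness_accuracy   # misidentification
    numerator = prior_blue * says_blue_if_blue
    evidence = numerator + prior_green * says_blue_if_green
    return numerator / evidence

p = posterior_blue(prior_blue=0.15, witness_accuracy=0.80)
print(f"P(Blue | witness says Blue) = {p:.2f}")   # 0.41
```

The base rate (few Blue cabs) drags the posterior below 50 percent even though the witness is usually right, which is exactly what intuition misses.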
Chapter 17 - regression to the mean. Suppose I score far above the average on a test. Talent, or luck? The chances are it was a bit of both, and if I take the test a second time I will probably get a lower score, not because I am any stupider but because the first observation of me wasn't exactly accurate. This is called regression to the mean.
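The effect is easy to demonstrate by simulation. The sketch below uses made-up numbers purely for illustration: each score is a fixed "talent" plus random luck, and the people selected for an extreme first score land closer to the average on a retest, with no change in ability:

```python
import random

# Simulate regression to the mean: score = talent + luck.
random.seed(42)
talent = [random.gauss(100, 10) for _ in range(10_000)]
test1 = [t + random.gauss(0, 10) for t in talent]
test2 = [t + random.gauss(0, 10) for t in talent]

# Select everyone in roughly the top 5% on the first test...
cutoff = sorted(test1)[int(0.95 * len(test1))]
top = [i for i, s in enumerate(test1) if s >= cutoff]

# ...and compare their averages on the two tests.
avg1 = sum(test1[i] for i in top) / len(top)
avg2 = sum(test2[i] for i in top) / len(top)
print(f"top group, test 1: {avg1:.1f}")   # well above 100
print(f"top group, test 2: {avg2:.1f}")   # closer to 100
```

The top group's second-test average falls back toward 100 because being in the top 5 percent selects for good luck as well as high talent, and the luck does not repeat.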
It is not about the things you are measuring; it is about the nature of the measurement instruments. Don't mistake luck for talent. Chapter 18 - taming intuitive predictions.
The probability of the occurrence of an event which depends on a number of prior events is the cumulative probability of all those prior events. The probability of a smart grade-school kid becoming a Rhodes scholar is the cumulative probability of passing a whole series of hurdles. The message in this chapter is that we tend to overestimate our ability to project the future.
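The hurdle probabilities below are hypothetical, purely for illustration; the review gives no actual figures. The point is that the cumulative probability is the product of the individual chances, and the product shrinks quickly:

```python
import math

# Hypothetical hurdles on the way to a Rhodes scholarship.
# Each probability is invented for illustration only.
hurdles = {
    "strong grades in school": 0.5,
    "admission to a top college": 0.3,
    "top of the college class": 0.2,
    "winning the scholarship itself": 0.1,
}

# Chance of clearing every hurdle = product of the individual chances.
p_all = math.prod(hurdles.values())
print(f"P(all hurdles) = {p_all:.4f}")   # 0.0030
```

Even with individually plausible odds, the chain multiplies out to three in a thousand, which is why confident long-range predictions about any one kid are usually wrong.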
Part three - overconfidence Chapter 19 - the illusion of understanding. We make judgments on the basis of the knowledge we have, and we are overconfident about the predictive value of that knowledge. To repeat his example, we see the tremendous success of Google. We discount the many perils which could have totally derailed the company along the way, including the venture capitalist who could have bought it all for one million dollars but thought the price was too steep.
Chapter 20 - The illusion of validity. Kahneman once again anticipates a bit more statistical knowledge than his readers are likely to have. The validity of a measure is the degree to which an instrument measures what it purports to measure.
You could ask a question such as whether the SAT is a valid measure of intelligence. The answer is, not really, because performance on the SAT depends quite a bit on prior education and previous exposure to standardized tests. You could ask whether the SAT is a valid predictor of performance in college.
The answer there is that it is not very good, but nonetheless it is the best available predictor.
It is valid enough because there is nothing better. To get back to the point, we are inclined to assume measurements are more valid than they are, in other words, to overestimate our ability to predict based on measurements. Chapter 21 - intuitions versus formulas.
The key anecdote here is about a formula for predicting the quality of a French wine vintage. The rule-of-thumb formula beat the best French wine experts. Likewise, mathematical algorithms for predicting college success are at least as successful as long interviews with placement specialists, and much cheaper. Chapter 22 - expert intuition, when can we trust it? The short answer is: in situations in which prior experience is genuinely germane to new situations, there is some degree of predictability, and the environment provides feedback so that experts can validate their predictions.
He would trust the expert intuition of a firefighter; there is some similarity among fires, and the firefighter learns quickly from his mistakes.
He would not trust the intuition of a psychiatrist, whose mistakes may not show up for years. Chapter 23 - the outside view.
The key notion here is that people within an institution, project, or any endeavor tend to let their inside knowledge blind them to things an outsider might see. We can be sure that most insiders in Enron foresaw nothing but success. An outsider, having seen more cases of off-balance-sheet accounting and the woes it can cause, would have had a different prediction.
Chapter 24 - the engine of capitalism. This is a tour of decision-making within the capitalist citadel. It should destroy the notion that there are CEOs who are vastly above average, and also the efficient-markets theory. The guys in charge often don't understand, and more importantly, they are blind to their own lack of knowledge. Part four - choices. This is a series of chapters about how people make decisions involving money and risk.
In most of the examples presented there is a financially optimal alternative. Many people will not find that alternative because of the way the problem is cast and because of exogenous factors. Those factors include marginal utility: another thousand dollars is much less important to a millionaire than to a wage slave. Chapter 26 - Prospect theory: the bias against loss.
Chapter 27 - The endowment effect. I will not pay as much to acquire something as I would demand if I already owned it and were selling. Chapter 28 - Bad Events.