An experimenter divided his subjects into two groups and gave them the same difficult task. Each group had to look through a microscope and discriminate which of two types of cells they were looking at. Neither group had any prior knowledge of how to do this.
The group in the first room received accurate feedback on their performance, and after making a lot of wrong guesses, gradually began to improve. When they reached a point where they started to get the answers right, the experimenter stopped them.
The group in the second room received purely random feedback. The experimenter would flip a coin and tell them they were right or wrong depending on whether it came up heads or tails. Not surprisingly, this group's performance did not improve. They did, however, develop extremely elaborate theories to justify their guesses.
The experimenter then brought the two groups together to discuss their conclusions. What happened? The second group, whose answers were complicated but wrong, convinced the first group, whose answers were simple but mostly correct, that they were mistaken. The first group was seduced by the more complex (if fanciful and incorrect) view.
If the two groups had been studying investing instead of looking at slides, the outcome would have been the same. The few good investment ideas out there would be sidelined by the many terrible ideas crowding the field.
When you mix our basic human insecurity ...