- Anchoring – the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions (also called "insufficient adjustment").
- Attentional bias – the tendency for emotionally dominant stimuli in one's environment to preferentially capture and hold attention, causing relevant data to be neglected when making judgments of a correlation or association.
- Confirmation bias – the tendency to search for or interpret information in a way that confirms one's preconceptions.
- Congruence bias – the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
- Knowledge bias – the tendency of people to choose the option they know best rather than the best option.
There are hundreds of ways that your thought processes can go awry.
This is why peer review and consensus building (where everyone agrees on why the outliers are being unreasonable; not that the majority rules the day) are so critically important in the sciences - scientists are in no way immune to any of these problems. Anyone who puts most of their life's work into something is very likely to be blind to many of their own biases on the subject. After all, those biases didn't arise by accident; they are very carefully studied and rationalized. But when we put hundreds or thousands of minds together on a question, we can cover our blind spots far better than one mind working alone.
We should not imagine that 'morality' (or our sense of Right or Wrong) should be any different. We learn about morality the exact same way we learn about the world... We observe, we build models, we predict, we select desirable outcomes (sometimes altruistically and sometimes selfishly), we act, we measure the results against our model, and we try again.
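The observe/model/predict/act/measure cycle described above can be sketched as a simple guess-and-correct loop. This is purely illustrative - the hidden value, learning rate, and variable names are all hypothetical, standing in for whatever a real learner is trying to get right:

```python
# Illustrative sketch: learning as iterative prediction and correction.
HIDDEN = 7.0           # the "truth" the learner never observes directly
LEARNING_RATE = 0.5    # how strongly feedback revises the model

model = 0.0            # initial (wrong) model of the world
for _ in range(20):
    prediction = model                  # predict from the current model
    feedback = HIDDEN - prediction      # act, then measure the result
    model += LEARNING_RATE * feedback   # revise the model and try again

# After repeated cycles, the model converges toward the hidden value.
```

Each pass shrinks the gap between model and reality, which is exactly the point of the essay's loop: you never start with the right answer, you iterate toward it.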
This process is GOING to result in biases: we are working from limited information and limited experience, driven by emotional currents that we don't directly control. We NEED the feedback loop from other minds in order to do better.
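Why many minds beat one can be illustrated statistically: if each observer carries a personal bias that points in an independent direction, averaging their estimates cancels much of the individual error. A minimal sketch (the truth value, bias scales, and trial counts are all hypothetical):

```python
import random

rng = random.Random(0)
TRUTH = 10.0  # the value everyone is trying to estimate

def biased_estimate(rng, bias_scale=2.0, noise_scale=1.0):
    # Each observer has a personal bias plus some random noise.
    bias = rng.gauss(0, bias_scale)
    return TRUTH + bias + rng.gauss(0, noise_scale)

def crowd_estimate(rng, n):
    # Averaging observers whose biases point in independent
    # directions cancels much of the individual error.
    return sum(biased_estimate(rng) for _ in range(n)) / n

TRIALS = 200
avg_solo_error = sum(abs(crowd_estimate(rng, 1) - TRUTH)
                     for _ in range(TRIALS)) / TRIALS
avg_crowd_error = sum(abs(crowd_estimate(rng, 100) - TRUTH)
                      for _ in range(TRIALS)) / TRIALS
# The pooled estimate lands far closer to the truth than a lone observer's.
```

The caveat built into the simulation matters for the essay's argument too: the cancellation only works when the biases are independent. A crowd that shares one bias just averages its way to the same wrong answer, which is why consensus must include dissenting and minority voices.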
This is why I find the Golden Rule to be a complete failure as a moral guide - it puts far too much stock in the individual and ignores the negotiation, the feedback, and the consensus building that are vital to any moral system. The Golden Rule is little more than a nice platitude because it assumes that the other person will think exactly as you do. I'm not saying the Golden Rule is BAD, but rather that it is in no way sufficient.
How we interact with one another is extremely important, not just to us individually but to how we're going to intellectually evolve as a species.
This means that Humans desperately need to sit around and talk to each other about what is Right and what is Wrong. But we need to do it in a context where we're not presuming that 2,000-year-old morality is reasonable to continue to promulgate. We know more about human behavior, sexuality, and disease now than people did then - we can make better decisions than they did.
The advantage of secular morality over religious morality is that we can admit our ignorance. We don’t KNOW the perfect answers – so we can engage and discover and learn and be willing to be wrong so that we can improve. If you freeze your morality based on ancient superstitious beliefs then you freeze out this process of improvement based on increasing knowledge.
We have also learned some things about Morality itself. We know that it's wrong for Majority or Might alone to rule; Minority opinions must have a respected voice and perspective.
Cognitive neuroscience has also begun to reveal important information about how our brains work and, perhaps more importantly, how they fail.
One study looks at how our brains process moral questions differently depending upon the immediacy of the situation (The Theory of Moral Neuroscience). To me, this is a moral bias and failure that is wholly unjustified in the modern era. We need to work to expand our empathy to a greater tribe. I think that many people feel this way already.
Cognitive Neuroscience and the Structure of the Moral Mind (and lots more info from Joshua D. Greene) looks at many of the studies that are peeling back the layers of our brain and peering into how the Mind really works.
With new, modern tools for studying the brain, such as fMRI, the Cognitive Neurosciences are exploding. I suspect many of our common-sense notions will be soundly refuted, but at the end of the day we still need to find better ways to treat each other - living up to the best that human beings are known for, not our worst.
Google: Neuroscience Morality