The moralistic fallacy is the informal fallacy of assuming that an aspect of nature which has socially unpleasant consequences cannot exist. Its typical form is "if X were true, then Z would happen!", where Z is something morally, socially, or politically undesirable. What ought to be moral is assumed a priori to also be naturally occurring. The moralistic fallacy is often presented as the reverse of the naturalistic fallacy. However, it can also be seen as a variation of the same fallacy; the difference between the two can be regarded as pragmatic, depending on the intentions of the person committing it: the naturalistic fallacy if one wants to justify existing social practices by arguing that they are natural; the moralistic fallacy if one wants to combat existing social practices by denying that they are natural.
Steven Pinker writes that "[t]he naturalistic fallacy is the idea that what is found in nature is good. It was the basis for social Darwinism, the belief that helping the poor and sick would get in the way of evolution, which depends on the survival of the fittest. Today, biologists denounce the naturalistic fallacy because they want to describe the natural world honestly, without people deriving morals about how we ought to behave (as in: If birds and beasts engage in adultery, infanticide, cannibalism, it must be OK)." Pinker goes on to explain that "[t]he moralistic fallacy is that what is good is found in nature. It lies behind the bad science in nature-documentary voiceovers: lions are mercy-killers of the weak and sick, mice feel no pain when cats eat them, dung beetles recycle dung to benefit the ecosystem and so on. It also lies behind the romantic belief that humans cannot harbor desires to kill, rape, lie, or steal because that would be too depressing or reactionary."
Moralistic fallacy: what is good or moral is assumed to be what is found in nature (ought implies is).
Naturalistic fallacy: what is found in nature is assumed to be good or moral (is implies ought).
Basic scientific findings or interpretations are sometimes rejected, or their discovery, development, or acknowledgement opposed or restricted, on the grounds that the knowledge might be misused or harmful.
In the late 1970s, Bernard Davis applied the term moralistic fallacy in its present sense, in response to growing political and public calls to restrict basic research (as opposed to applied research) amid criticisms of dangerous knowledge (as opposed to dangerous applications).