Deeper cultural assumptions about human nature

Every culture has shared assumptions about what it means to be human, what our basic instincts are, and what kinds of behavior are considered inhuman and therefore grounds for ejection from the group. Being human is not just a physical property but also a cultural construction, as we have seen throughout history. Slavery was often justified by defining slaves as “not human.” In ethnic and religious conflicts the “other” is often defined as not human. Within the category of those defined as human, we have further variation. In their comparative study, Kluckhohn and Strodtbeck (1961) noted that in some societies humans are seen as basically evil, in others as basically good, and in still others as mixed or neutral, capable of being either good or bad. Closely related are assumptions about how perfectible human nature is. Is our goodness or badness intrinsic so we simply accept what we are, or can we, through hard work, generosity, or faith, overcome our badness and earn our salvation or nirvana? Where a given macroculture ends up in terms of these categories is often related to the religion that dominates that cultural unit, but, as we shall see, this issue is very much at the heart of leadership.

At the organizational level, the basic assumptions about the nature of human nature are often expressed most clearly in how workers and managers are viewed. Within the Western tradition, we have seen an evolution of assumptions about human nature, as follows:

  1. Humans as rational-economic actors
  2. Humans as social animals with primarily social needs
  3. Humans as problem solvers and self-actualizers, with primary needs to be challenged and to use their talents
  4. Humans as complex and malleable (Schein, 1965/1980)

Early theories of employee motivation were almost completely dominated by the assumption that the only incentives available to managers are monetary ones, because it was assumed that the essential motivation of employees was economic self-interest. The Hawthorne studies (Roethlisberger and Dickson, 1939; Homans, 1950) launched a new series of “social” assumptions, postulating that employees are motivated by the need to relate well to their peer and membership groups and that such motivation often overrides economic self-interest. The main evidence for these assumptions came from studies of restriction of output, which showed clearly that workers would reduce their take-home pay rather than break the norm of “a fair day’s work for a fair day’s pay.” Furthermore, workers would put pressure on high producers (“rate busters”) to work less hard and earn less money in order to preserve the basic norm of a fair day’s work.

Subsequent studies of work, particularly on the effects of the assembly line, introduced another set of assumptions: employees are self-actualizers who need challenge and interesting work to provide self-confirmation and valid outlets for the full use of their talents (Argyris, 1964). Motivation theorists, such as Maslow (1954), proposed that there is a hierarchy of human needs, and an individual will not attend to the “higher” needs until lower ones are satisfied: If the individual is in a survival mode, economic motives will dominate; if survival needs are met, social needs come to the fore; if social needs are met, self-actualization needs become salient.

McGregor (1960) observed that within this broad framework, an important second layer of assumptions was held by managers vis-à-vis employees. Ineffective managers tended to hold an interlocked set of assumptions that McGregor labeled Theory X. Managers who held these assumptions believed that people are lazy and must therefore be motivated with economic incentives and be controlled by constant surveillance. In contrast, effective managers held a different set of assumptions that he labeled Theory Y. These managers assumed that people are basically self-motivated and therefore need to be challenged and channeled, not controlled. McGregor and other researchers observed that insufficient financial incentives act as “demotivators,” but that adding financial incentives does not by itself increase motivation; only challenge and the use of a person’s talents can do that (Herzberg, 1968). Whereas Theory X assumes that employees are intrinsically in conflict with their employing organization, Theory Y assumes that it is possible to design organizations that enable employee needs to be congruent with organizational needs.

Most current theories are built on still another set of assumptions, namely, that human nature is complex and malleable and that we cannot make a universal statement about human nature. Instead, we must be prepared for human variability. Such variability reflects (1) changes in the life cycle in that motives may change and grow as we mature and (2) changes in social conditions in that we are capable of learning new motives as may be required by new situations. Longitudinal studies of people have shown that with work experience, they develop “career anchors” that begin to guide and constrain the career based on self-perceived competencies, motives, and values (Schein, 1978, 1993, 2006). Such variability makes it essential for organizations to develop some consensus on what their own assumptions are because management strategies and practices reflect those assumptions. Both the incentive and control systems in most organizations are built on assumptions about human nature, and if those assumptions are not shared by the managers of the organization, inconsistent practices and confusion will result.

McGregor (1960) also noted that because humans are malleable, they often respond adaptively to the assumptions that are held about them. This is particularly a problem in organizations that are run by managers who share a Theory X set of assumptions because the more that employees are controlled and treated as untrustworthy, the more likely they are to behave in terms of those expectations. The cynical Theory X manager then feels vindicated but fails to note that the employee behavior was learned and does not reflect intrinsic human nature. A more extreme version occurs when senior managers with personality problems create organizational pathology within the organization they manage (Kets de Vries and Miller, 1984, 1987; Goldman, 2008).

DEC was one of the most Theory Y driven organizations I have ever encountered. The core assumption in Ciba-Geigy was more difficult to decipher, but there were strong indications that individuals were viewed ultimately as good soldiers, who would perform responsibly and loyally, and whose loyalty the organization would reward. Individuals were expected to do their best in whatever was asked of them, but loyalty was ultimately assumed to be more important than individual creativity. It seems that in DEC the individual was ultimately more important than the organization and that in Ciba-Geigy the organization was ultimately more important than the individual.

Source: Schein, Edgar H. (2010), Organizational Culture and Leadership, 4th ed., Jossey-Bass.
