
Propaganda as Pump Handle (2006)

Communications theorist and propaganda scholar Harold Lasswell famously stated, “Propaganda, considered as the technique of controlling attitudes by the manipulation of significant symbols, is no more moral or immoral than a pump handle” (1928: 264).  Such a pronouncement might fly in the face of Anglo-American common sense, which equates propaganda with lies and deception and is averse to the idea of its being directed at anyone for any purpose.  In this view, propaganda is anything but neutral.  Furthermore, many associate propaganda with Hitler and the Third Reich or with the Soviet Union under Stalin; mind control and evil intentions come to mind.  But do these assumptions accurately describe the term?

Any scholarly examination of propaganda must look beyond automatic emotional reactions and delve into its broader implications and more varied effects.  While propaganda has been used to promote much evil, one line of argument goes, it is surely capable of good as well, and is therefore neutral.  This is undoubtedly what Lasswell had in mind.  He wanted to approach the subject from that direction in order to better understand it and, hopefully, limit its negative effects.  But might there be unexamined consequences in taking an analytically value-neutral stance toward propaganda studies?  Propaganda can be used for good or bad, but these are obviously relative value concepts: one group’s good is another’s evil.  So ultimately, does this really tell us anything?  Almost any use of propaganda would surely be rationalized by the propagandists as the right thing, as proper and good.  Linguist and media critic Noam Chomsky makes precisely this point (1989): “It is probable that the most inhuman monsters, even the Himmlers and the Mengeles, convince themselves that they are engaged in noble and courageous acts” (19).  Simply saying that propaganda is neutral and that we need only use it for good fails here, because much of what constitutes the “good” is contingent upon any number of factors – preconceptions and ideology, social class, education, political and economic power – that must be examined in order to understand propaganda as a social force actualized in the real world.  Propaganda is embedded with value and, owing to existing unequal social relations, largely becomes a tool for the powerful, who are the principal operators and owners of the channels of mass communication, if not the primary influencers of them.  The value-neutral approach overlooks this social (mal)distribution of power and, in doing so, helps to reinforce and reproduce this reality of inequality.

So what exactly do we mean by propaganda?  Among the countless social scientists, psychologists, philosophers, communications scholars, and others who study propaganda, one thing is certain: there is no clearly agreed-upon definition of the term.  By one author’s count, more than one hundred distinct definitions exist in the relevant literature (Cunningham, 2002: 60).  Yet, despite differing opinions on exactly what it is, clear common features emerge.  Randal Marlin (2002) finds that propaganda is commonly seen as “an organized and deliberate attempt to influence many people, directly or indirectly” (22, emphasis added).  But while this captures definitional characteristics shared across a diverse literature, such a description alone is insufficient to distinguish propaganda from other forms of communication.  So some scholars go further, stressing manipulation as well as intentionality.  Lasswell (1927) succinctly calls propaganda “the management of collective attitudes by the manipulation of significant symbols” (627).  Likewise, the great propaganda theorist Jacques Ellul (1973 [1965]) more specifically defines it as “a set of methods employed by an organized group that wants to bring about the active or passive participation in its actions of a mass of individuals, psychologically unified through psychological manipulations and incorporated in an organization” (61).  Stressing these properties – purposive manipulation in an organized manner, the mobilization of mass groups, and a utilitarian-based method – helps to distinguish propaganda from common forms of education and communication, and informs our discussion of whether it is best analyzed as a neutral concept.

While we can divide approaches to propaganda into three general categories – negative, neutral, and favorable (Marlin, 2002: 15-23) – the value-neutral approach today is widely accepted and “amounts to orthodoxy” (Cunningham, 2002: 129).  This dominant interpretation, as noted above, is strongly driven by the idea that propaganda can be used for good or bad and is, therefore, contingent upon the context of its use.  Lasswell’s propaganda-as-pump-handle statement represents this line of thought and was one of the earliest statements to frame it this way, encouraging a scholarly analysis of the subject “in a clear-headed, stick-to-the-facts way” (Brown, 2006).  Stanley Cunningham traces the roots of the value-neutral perspective to a moral detachment in approaches to research and analysis:

The neutralist thesis tends to emerge when propaganda, conceived as either psychological effects or, even more impersonally, as detached messages, is routinely distanced from the epistemic and ethical center-points of human communicative action. When propaganda is thus detached from its real center of gravity – the human act in all its epistemic, intentional and moral complexity – it can much more easily be treated as ethically indifferent, as a one-dimensional effect or artifact. (2002: 152)

By looking only at the utility of propaganda – its effectiveness, technical aspects, the messages sent – the researcher, in seeming to take a scientifically objective approach, creates a moral vacuum that can result in unseen consequences.  But is it so easy to detach ethical concerns from the technical application of propaganda, and what, if any, are the dangers in doing so?

Around the same time Lasswell was arguing that propaganda was best studied as a neutral principle or process (1928; 1934), a scholarly shift was also occurring in the field of propaganda analysis.  Critical studies of propaganda in the 1930s gave way in the following decades to effects-based analyses, which were heavily influenced by corporate and government desires to manage public opinion (Cunningham, 2002: 182).  Such a move encouraged a more utilitarian approach to the subject, focusing on its properties as a technique and tool for achieving ends and paying less attention to its possible negative features.  Ellul addresses the growing instrumental nature of modern propaganda as follows: “More and more, the propagandist is a technician using a keyboard of material media and psychological techniques” (1973 [1965]: 197).  This is what results from the technocratic tendency to “rationalize” the knowledge of social science and psychology so that it can be used for any desired ends, since ethical considerations are not part of the calculus.  Approaching propaganda as a neutral process dissociated from ethics opens the door to a “disinterested” (mis)use.


Robert K. Merton

Robert K. Merton was right to warn social scientists whose research and interests include the study of public opinion and mass persuasion against taking “a detached and dispassionate” stance with regard to that study (1945: 271).  Such a position takes away an important degree of responsibility for how one’s findings are used.  In short, it is impossible to separate the means used from the ends they may achieve, and attempting to do so might very well contribute to the immoral use of propaganda.  “The investigator may naïvely suppose that he [sic] is engaged in the value-free activity of research, whereas in fact he may simply have so defined his research problems that the results will be of use to one group in the society, and not to others.  His very choice and definition of a problem reflect his tacit values” (Merton, 1945: 271).  Merton’s warning brings up a crucial issue.  It forces us to look at the social context in which propaganda actually takes place, at society’s institutional structure and the embedded values of researchers and propagandists, and to consider how these elements shape propaganda and its study.  Such an insight raises important questions: Who defines the terms of propaganda when it is used?  Is propaganda socially neutral, in the sense that all groups and persons have equal access to its powers and benefits?  Are there built-in assumptions in propaganda analysis?  And more broadly, what are the consequences for democracy?


Jürgen Habermas

Some theorists from the early part of the 20th century began to perceive propaganda as a necessary component of the modern, liberal democratic state (see Lippmann, 1965 [1922]; Lasswell, 1928: 261; Bernays, 1928, 1947).  Yet this idea rests on a specific conception of democracy, one reflecting certain values and norms – not to mention class interests.  In expanding upon democratic theory, German philosopher Jürgen Habermas developed the idea of the public sphere, which he defines as “a realm of our social life in which something approaching public opinion can be formed.  Access is guaranteed to all citizens” (1974 [1964]: 49).  This normative notion proposes an open and dialogical mode of social discourse and rests on a general assumption that individuals, through open channels of communication and free access to information through media, will act rationally in developing public opinion, eventually arriving at a common good.  Propaganda, in contrast, is largely one-way, manipulative, and non-voluntary.  Habermas’s conception of democracy is incompatible with a “democratic” system that relies on propaganda to mobilize public opinion, and as such it provides an important alternative to the propagandist’s democratic vision – a vision that harbors negative stereotypes about common people and is in full accord with elite values and interests.

In presenting his theory of democracy, Lasswell (1928) argues that “[p]ropaganda, if vigorously used on all sides, makes for the maintenance of public interest in political affairs” (263, emphasis added).  Similarly, Edward Bernays, a principal founder of the public relations industry, believes that in contemporary democracies “the privilege of attempting to sway public opinion is everyone’s.  It is one of the manifestations of democracy that anyone may try to convince others and to assume leadership on behalf of his own thesis” (1928: 959).  While true in theory, such positions assume the existence of a diversity of opinions, each of relatively equal status and ability to reach the public.  But is this truly the case in modern mass society?  Is there no essential difference between the efforts of a small, relatively powerless group to get its message out, heard, and entered into the social discourse, and the efforts, say, of a public relations firm working for wealthy corporate interests, or the activities of a powerful government in influencing opinion?  While in the abstract propaganda, as a process, as a means to achieve an end, is arguably value-neutral, the social reality within which it is practiced is anything but a neutral space where any single group’s propaganda has as much chance to persuade as the next.  Ellul makes this very point when he writes:

[T]he freedom of expression of one who makes a speech to a limited audience is not the same as that of the speaker who has all the radio sets in the country at his disposal, all the more as the science of propaganda gives to these instruments a shock effect that the non-initiated cannot equal. (1973 [1965]: 237)

It is easy to see how, in such a reality of unequal social power, propaganda – while not solely possessed by the powerful – is inclined to be a tool for those people and groups that wield such power, and as such “tends to be far more top-down than it is bottom-up” (Kimble, 2005: 203).

Perhaps the most prominent proponent of the elite use of propaganda was Edward Bernays.  Bernays (1947) openly called for using modern mass communications and scientifically mastered persuasive techniques to organize society in accord with powerful interests, in what he called “the engineering of consent” (114).  He saw mass communications “as a potent force for social good or possible evil” and felt that leadership, “with the aid of technicians,” should work towards furthering their version of the social good (1947: 113, 114).  While Bernays perceived this consent engineering to be “the very essence of the democratic process,” he saw that, due mostly to lack of education, the general public had to be directed towards “socially constructive goals and values” deemed important by the leadership (114).  Inherent in this is the belief that the complexities of modern society are too much for the average person to evaluate and make rational decisions about.  Therefore only a select, highly-educated leadership is able to make such determinations and further the overall social interest and well-being.  But as Leonard Doob points out, “If wisdom concerning facts and values cannot be expected from the masses, it does not follow that only experts have a monopoly on the kind of wisdom required to resolve the great problems of an age” (1948: 206).


Walter Lippmann (1914)

American journalist and social philosopher Walter Lippmann (1965 [1922]) also reflects the elitist conception when he writes, “[T]he common interests very largely elude public opinion entirely, and can be managed only by a specialized class whose personal interests reach beyond the locality” (195).  Lippmann names this as one of the principal reasons for using propaganda as a means of shaping public opinion.  Likewise, according to Lasswell (1934), “[t]he modern propagandist . . . recognizes that men [sic] are often poor judges of their own interests, flitting from one alternative to the next without solid reason or clinging timorously to the fragments of some mossy rock of ages” (24).  Even certain later theorists of propaganda voice similarly elitist ideas.  Terence Qualter (1985) argues that the task of modern government “is so highly technical in nature” that its operation should be kept out of the hands of “inexpert public opinion” (127).  Despite the scientifically neutral sound of such ideas and the confidence with which they are uttered, one can clearly see they are formulated using value judgments, which calls into question the very utility of evaluating propaganda as value-neutral.  It is their belief that the public – due to its own ignorance – needs to be manipulated into accepting the ideologically- and value-laden assumptions of what an elite minority deems to be the overall social good.  But one cannot assume that what elite sectors of society determine to be the common good will actually be so for other parts of society.  Moreover, a wealthy minority could very well take advantage of its social power and greater command of the communications process to use propaganda to promote policies harmful to the general population yet beneficial to themselves: lower taxes on the wealthy while social programs for the poor are cut, deregulation of business enterprises while jobs are moved overseas, and so on.

These theorists want to strip propaganda of ethical considerations, to make it a sterile object of research, which in turn contributes to its use as a rationalistic, technocratic tool mastered by elite, educated managers of society, the people best able to handle the “highly technical” task of government (Qualter, 1985: 127).  But unstated in these assumptions is the fact that any such efforts require the value judgments of the propagandists and necessitate a specific concept of what is best for society, which, as noted above, will rarely diverge from what is best for the powerful interests backing the propagandists.  Propaganda is therefore a tool for spreading and enforcing value, in the sense that any social order it is employed to promote, and any ideas it is meant to enforce, are reflections of particular beliefs and social relations.  Propaganda, as actualized in reality and not as an abstract process, will always come embedded with values and ideological assumptions.

Accepting the value-neutral thesis of propaganda allows the analyst to avoid looking at how social power is intrinsically tied to propaganda.  It encourages an abstraction – propaganda as a concept considered without the complexities of social hierarchies and classes – that does not exist in reality.  This, in turn, permits its uncritical incorporation into the prevailing social system as a tool for promoting institutional power.  Propaganda absent social relations is an impossibility in mass society, where public opinions “are actively realized within the prevailing institutions of power” (Mills, 1956: 75).  And it is through such power that contemporary elites control the principal channels of modern communications: the mass media, which are likewise the primary sources of publicly available information and, as such, essential tools of the propagandist.  Within a class society, a society where power is unequally distributed as in the global capitalist system, automatic structural constraints exist which cannot be ignored.  Propaganda, as realized, will never be neutral in actuality, because power can dictate the truth, define what is considered correct, and shape the very structure of propaganda.

The intent of this essay was not to portray propaganda as all-powerful and evil, a monopolized tool of the elite used to control, in an absolute sense, the public mind.  Rather, it was an attempt to demonstrate the difficulty, and danger, of separating such “tools” from any ethical center and, more importantly, from the social, material reality within which the tools of propaganda are used – a reality in which existing power relations are justified as right and good.  As shown, many of the theorists who see propaganda as neutral clearly internalize certain values and justify propaganda as a means of enforcing those values.  Alternatives are rarely if ever addressed.  Undoubtedly propaganda can be used for peaceful purposes and for promoting the overall common good.  But is it a necessary condition for achieving such goals?  Might other options be available?  The public sphere as proposed by Habermas points to a possible alternative.  It offers a substitute for the largely one-way, manipulative flow of information that propaganda represents.  It is a vision of a participatory democracy, not a managed one.  Therefore, we can envision peace through such a mechanism without needing to resort to propaganda – but can we say the same for war?  At least where advanced industrialized societies are concerned, war and propaganda go hand in hand.  From before World War I to the current Global War on Terrorism, propaganda has been seen as an essential component of military operations.  Propaganda, even when approached as a neutral concept, will always reflect social relations when applied.

Propagandists and propaganda scholars, despite their best intentions, harbor certain values which will inform their approach to the subject and affect their conclusions.  This is unavoidable, and we must not lose sight of the fact.  In this sense, propaganda is infused with value.  It will most certainly, in actuality, be a purveyor of values.  And it is best seen as such, because a critical analytical approach allows a deeper, more sustained understanding of the current social system and of how mass media and communications technologies reflect often unconsidered disparities of power.  The neutrality thesis ignores this and subsequently opens the door to abuse.



Bernays, Edward (1947): The Engineering of Consent.  Annals of the American Academy of Political and Social Science, 250, March, 113-120.

Bernays, Edward (1928): Manipulating Public Opinion: The Why and The How.  The American Journal of Sociology, 33:6, 958-971.

Brown, John (2006): Two Ways of Looking at Propaganda.  [On-line]  (Accessed 9 Nov. 2006).

Chomsky, Noam (1989): Necessary Illusions: Thought Control in Democratic Societies.  Boston: South End Press.

Cunningham, Stanley B. (2002): The Idea of Propaganda: A Reconstruction.  London: Praeger.

Doob, Leonard William (1948): Public Opinion and Propaganda.  New York: Henry Holt and Company.

Ellul, Jacques (1973 [1965]): Propaganda: The Formation of Men’s Attitudes.  Translated by Konrad Kellen and Jean Lerner.  New York: Vintage.

Habermas, Jürgen (1974 [1964]): The Public Sphere: An Encyclopedia Article.  New German Critique, No. 3 (Autumn, 1974), 49-55.  Translated by Sara Lennox and Frank Lennox.

Kimble, James J. (2005): Whither Propaganda? Agonism and “The Engineering of Consent”.  Quarterly Journal of Speech, 91:2, 201-218.

Lasswell, Harold D. (1928): The Function of the Propagandist. International Journal of Ethics, 38:3, 258-278.

Lasswell, Harold D. (1934): Propaganda.  In Propaganda, edited by Robert Jackall.  London: Macmillan Press, 1995.

Lasswell, Harold D. (1927): The Theory of Political Propaganda.  The American Political Science Review, 21:3, 627-631.

Lippmann, Walter (1965 [1922]): Public Opinion.  New York: Free Press; London: Collier Macmillan.

Marlin, Randal. (2002): Propaganda and the Ethics of Persuasion.  Peterborough: Broadview Press.

Merton, Robert K. (1945): Mass Persuasion: A Technical Problem and a Moral Dilemma.  In Propaganda, edited by Robert Jackall.  London: Macmillan Press, 1995.

Mills, C. Wright. (1956): The Mass Society.  In Propaganda, edited by Robert Jackall.  London: Macmillan Press, 1995.

Qualter, Terence H. (1985): Opinion Control in the Democracies.  London: Macmillan.



