Ethics in Scientific Research

PROCEDURAL MISCONDUCT BY SCIENTISTS: PREVENTION AND REMEDIES

By Michael E. Marotta

PHYS 406: Ethical Issues in Physics; Dr. Patrick L. Koehn; Eastern Michigan University; Winter 2010.

Recent news about procedural misconduct by scientists is easy to find. However, it is a cliché that the plural of anecdote is not data. A full, formal investigation would gather ethnographic studies, carry out statistical inquiries, create one or more taxonomies, test those against other new data, and offer one or more falsifiable assertions about the causes of fraud in science, as well as presenting predictions on modes of response that mitigate, remediate or prevent the crimes. Such a study would be a topic in criminology. This paper is a framework for that investigation, and is necessarily brief. It is a partial requirement for a 1-credit undergraduate class, for which the length was set at five to ten pages.

Two literature searches revealed nothing of professional quality by practicing criminologists. Several of the sources cited here did draw on formal theory, though none appeared in a criminology journal. Recent popular books include Plastic Fantastic by Eugenie Samuel Reich and Voodoo Science (and other titles) by Dr. Robert L. Park. However, Fads and Fallacies in the Name of Science by Martin Gardner antedates them by half a century.[1] When that book was first issued, procedural misconduct by scientists was considered rare and unusual. Whether it actually was rare remains an open question.

From a criminological viewpoint, the lack of formal, institutional and bureaucratic reporting mechanisms is a structural impediment. The Uniform Crime Reports of the Department of Justice attract valid criticisms, but they are nonetheless direct reports by local police on defined categories of crimes. The National Crime Victimization Survey, a self-reported statistical sampling, consistently shows roughly twice the UCR’s numbers in every category. Whatever the limitations of both instruments, nothing like them exists for scientific research.

Several differently framed searches in two large academic databases supported the easy claim that misconduct, fraud and hoax were unknown in science before 1950[2] and remain rare today. The JSTOR database contains three million full-length articles from 1,200 journals, almost all peer-reviewed academic periodicals, with significant titles going back before 1900. Searches uncovered zero articles about misconduct in scientific research from 1900 to 1950. From 1950 to the present, 40 articles addressed the related problems, with an obvious upswing after 1987. The Gale CENGAGE Learning PowerSearch™ database, covering 1980-2010, yielded 56 titles. Of course, many were reports of headline news about Jan Hendrik Schön and other breaking stories in (nominally) peer-reviewed news magazines such as Science, Nature and Science News.
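To make such a trend concrete, the hits can be binned by decade. The following is a minimal illustrative sketch in Python, assuming a hand-collected list of publication years; the years shown are hypothetical placeholders, not the actual JSTOR results:

    from collections import Counter

    # Hypothetical publication years transcribed from a database search;
    # the actual JSTOR hit list would be substituted here.
    years = [1954, 1969, 1975, 1983, 1988, 1989, 1991, 1992, 1995, 1999, 2002, 2006]

    # Bin each year into its decade and count the hits per bin.
    by_decade = Counter((year // 10) * 10 for year in years)

    for decade in sorted(by_decade):
        count = by_decade[decade]
        print(f"{decade}s: {'#' * count} ({count})")

Even with placeholder data, this crude histogram makes an upswing after the late 1980s immediately visible.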

On the other hand, a relatively recent academic survey reported that misconduct is shockingly common. In 1994, “A Social Control Perspective on Scientific Misconduct” by Edward J. Hackett appeared in The Journal of Higher Education (Vol. 65, No. 3, pp. 242-260). A survey covering 1983-1988 of members of the Council of Graduate Schools found that 40% (118 institutions) had received allegations of possible misconduct. Those institutions whose external funding exceeded $50 million “were far more likely than others (69% to 19%) to hear such allegations.” The same article cited a 1991 survey of AAAS members in which 27% of 469 respondents claimed to have “personally encountered or witnessed scientific research that they suspected was fabricated, falsified or plagiarized during the past ten years.” In that same issue of the JHE, Mary Frank Fox wrote: “During the last fifteen years, hardly a year has gone by without the surfacing of a notorious case of misconduct in science.” She then cited nine cases by name.
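The raw percentages imply sample sizes worth making explicit. A short back-of-the-envelope check in Python, using only the figures as cited above:

    # Council of Graduate Schools survey: 118 institutions were 40% of
    # respondents, implying roughly 118 / 0.40 = 295 responding institutions.
    cgs_with_allegations = 118
    cgs_share = 0.40
    print(f"Implied CGS respondents: {cgs_with_allegations / cgs_share:.0f}")  # ~295

    # AAAS survey: 27% of 469 respondents suspected misconduct,
    # i.e., about 127 individual scientists.
    aaas_respondents = 469
    aaas_share = 0.27
    print(f"AAAS members reporting suspicions: {aaas_respondents * aaas_share:.0f}")  # ~127

These are modest samples, which is itself part of the structural reporting problem noted above.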

In science, we seek to do more than catalog. John M. Braxton, then of Syracuse University, now with Peabody College at Vanderbilt University, drew from the paradigms of criminology in three papers. In the first two,[3] he relied on the 1977 Survey of the American Professoriate by Ladd and Lipset[4] in which 4,383 respondents answered questions about “the general status and financial status of American higher education, involvement in research, academic standards, faculty organization and representation, and respondent background.” Selecting appropriate subsets of the data, Braxton sought to operationalize and test for adherence to the norms of science. He also attempted to account for deviation from those norms. The first paper tested control theory, the second anomie.

Control theory actually encompasses several competing constructs, beginning with the work of Emile Durkheim and including Power Control (John Hagan, et al.), Self-Control (Gottfredson and Hirschi), Social Bond Theory (Travis Hirschi alone), Control Balance (Charles Tittle), and Differential Coercion (Mark Colvin). John M. Braxton utilized only the first. Braxton relied, also, on Robert K. Merton’s Sociology of Science (University of Chicago Press, 1973). According to Merton, science demands universalism, communality, disinterestedness and organized skepticism. The first means that truth is independent of the speaker. Communality implies obligations to report to others as well as to acknowledge the works of others. Disinterestedness is the invitation to others to criticize one’s work. Organized skepticism is the bifurcated response to disinterestedness: we check the works of others, and we objectively question our own assertions.

Anomie is another concept from Durkheim. In attempting to explain why wealthy Protestants commit suicide more often than poor Catholics, Durkheim theorized that when social integration is lacking, the isolated individual deviates from the norm. Robert Merton also built on this theory, extending and expanding it. Merton’s theory of anomie may be better known to academics who are outside of sociology and criminology. It is Merton whom Braxton cited when explaining anomie.

In the first case, Braxton found that the perception that one’s colleagues conform to the norms of science is a stronger influence on actual behavior than is an internal standard. In the second case, Braxton found that perceptions of unfairness in the bestowal of rewards – from social recognition to tenure and salary – correlate with deviation from the norms of science. This squares with Hackett’s paper in the same volume. Hackett cited three workable theories to explain the problem: individual psychopathy, anomie, and alienation.

In a third paper, published between those two,[5] Braxton reported that institutions perceived as being of higher quality enforce stricter standards of conduct. The results came from a two-stage cluster sample: 138 completed responses from 300 department chairs at 100 institutions.

Braxton missed several important features of Merton’s theory, the most cogent of which is that in Merton’s language those who break the rules to attain the goals are called “innovators.” Those who deny the validity of both the goals and the rules – Merton’s “rebels” – are often the true inventors. An easy example from outside of crime and tort comes from computing. Forty years ago, programmers queued up at mainframe computers to submit decks of punched cards, which became symbolic of the dehumanizing effect of technology on humanity. Bill Gates, Steve Jobs, and “the pirates of Silicon Valley” denied the validity of both the goals and the means. Precisely because Merton’s vocabulary flatters such rule-breaking, we do not want to grant moral standing to scientists who innovate new paths around objectivity.

Understanding the phenomenon of misconduct in scientific research should begin with positives, rather than with negatives. We worry about wrongdoing without defining what it means to do right. We post the Periodic Table in our classrooms and laboratories, but we do not display the “Guidelines for Professional Conduct” of the American Physical Society. The APS webpage for those guidelines links to the ethics statements of the American Chemical Society, the American Mathematical Society, the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers. Those are all very short statements. The ethics code of the American Counseling Association runs 18 pages. By contrast, a standard first-year textbook for physics majors runs about 400 pages, even though all physics problems come down to the conservation of energy. So why not just teach physics as one page of important points and leave the rest to interpretation? Morality and ethics[6] in science should be at least a one-semester class in solving problems, on par with any other science course, with three or four hours per week of engagement.

If the family is supposed to be the source of learning about ethics, then why not leave the family in charge of physics? How else can one learn to ride a bicycle, ice skate, or make hot tea except by the understanding and application of physics? Of course, knowing how to do those things is not the same thing as knowing physics. The truth is that physics and ethics are both complicated. We choose to think long and hard about physics. We work hard at developing tests of our assumptions. Meanwhile, we ignore ethics. The laws of cause and effect being objective, universal and immutable, choosing to remain ignorant about ethics has direct and substantial consequences.

When we teach science to youngsters, we present it in one or more related discussions of the search for truth, or the discovery of knowledge, or more subtly, as a method for seeking those. In K-12 classrooms, posters of the scientific method or the experimental method list three, five, nine or some other number of “steps.” But we never teach the scientific method as an ethical system.

After all, how do you know the difference between right and wrong? Do we not perceive a problem, gather information about it, form a hypothesis, test that assertion, and report our results? Or do we just blunder randomly through life, taking our knocks, without learning from our mistakes? Philosophers and theologians have presented morality and ethics according to many theories, beliefs, faiths and assertions – absolutist, formalist, deontological, relativist, subjective, objective, and many, many more. None has demonstrated superior explanatory or predictive power. Studying ethics as a rational-empirical, experimental science seems attractive, especially for those who intend careers in science.

Strong psychological and epistemological reasons impel the teaching of ethics as a required class in science. Several psychological theories explain the failure to think. Freud suggested that we repress dangerous thoughts in order to avoid sexual conflicts within the family. Leon Festinger posited that we rationalize bad decisions that cannot be changed in order to reduce the “cognitive dissonance” they cause. According to Nathaniel Branden, the effort of thinking can be avoided by relying on directions from others; and the anticipation of potential conflict with those norms causes a “blank out,” a meta-choice not to think.

Regardless of the operative explanations for avoidance, suppression, and repression, mandatory university classes in ethics would force conceptual awareness of the issues. This could mitigate the occurrence of misconduct in science. Given the positive incentives to think about ethical issues, remediating the failures would then also be the responsibility of the universities. They are, and must remain, the final arbiters, adjudicators and mediators.

Positive learning will avoid many problems. However, failures are inevitable. How do we remediate the losses? What should be the consequences for plagiarism, falsification, fraud, trading sex for data, and tampering with the work of a competitor?[7]

Individual psychopathology is the one cause of crime for which no social prevention is possible.

That is not the same as “rational choice,” by which perpetrators take advantage of favorable circumstances. In line with this theory, the “crime triangle” consists of a willing perpetrator, an available victim and the lack of a capable guardian. That last could be no more than a neighbor standing on a porch, watching the street. Many studies demonstrate the calculating nature of criminals.[8] Raise the opportunity cost and you lower the crime rate. Social controls are part of that, of course. Teaching the norms and enforcing them is necessary, but not sufficient.
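The calculation behind rational choice can be stated as a simple expected-cost comparison. A minimal sketch in Python, with all names and numbers hypothetical: the offense stays attractive only while the expected penalty (probability of detection times the sanction) is smaller than the expected gain, so a capable guardian deters by raising the probability of detection.

    def offense_is_attractive(gain: float, p_detect: float, sanction: float) -> bool:
        """Rational-choice rule of thumb: offend only while the expected
        penalty (p_detect * sanction) is less than the expected gain."""
        return gain > p_detect * sanction

    # Hypothetical values: a fabricated result worth 100 units of career gain.
    print(offense_is_attractive(gain=100, p_detect=0.05, sanction=500))  # True: no guardian
    print(offense_is_attractive(gain=100, p_detect=0.40, sanction=500))  # False: capable guardian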

Criminal penalties have come only in cases where researchers cheated the government. Cheating a private entity is a matter for civil suit. Where no valuable consideration has been exchanged, i.e., no fraud occurred, university censure is the only control. Thus, on June 28, 2006, Eric Poehlman was sentenced to a year and a day in prison for defrauding the National Institutes of Health.[9] On the other hand, Jan Hendrik Schön was stripped of his doctorate and barred from serving as a referee for peer-reviewed journals. It may be that the academic community should continue to adjudicate for itself.

We look to the government for law enforcement, but the word “police” appears nowhere in the U.S. Constitution. Today, prosecutors complain that private companies hush up fraud, larceny and embezzlement to avoid the market consequences of negative publicity. However, handling these problems within the firm has advantages.

In responding to and resolving the criminal behavior of employees, organizations routinely choose options other than criminal prosecution, for example, suspension without pay, transfer, job reassignment, job redesign (eliminating some job duties), civil restitution, and dismissal...

While on the surface, it appears that organizations opt for less severe sanctions than would be imposed by the criminal justice system, in reality, the organizational sanctions may have greater impact... In addition, the private systems of criminal justice are not always subject to principles of exclusionary evidence, fairness, and defendant rights which characterize the public criminal justice systems. The level of position, the amount of power, and socio-economic standing of the employee in the company may greatly influence the formality and type of company sanctions. In general, private justice systems are characterized by informal negotiations and outcomes, and nonuniform standards and procedures among organizations and crime types.[10]

Private justice has deep, old roots which, in fact, nourished the growth of universities as independent entities. Roman law acknowledged collective entities, such as flocks of sheep and cities of people, that continued independent of the lives of individual members. Corporate charters were granted to guilds of firefighters, burial societies and other voluntary civic bodies. Over the centuries, the church took over the administrative machinery of the empire. Around 1150 AD, independent lecturers and their students gathered in Paris. Within the next generation, they received from the pope a charter of incorporation as a guild, making them independent of both king and bishop.

Oxford’s independence came from its isolated locale. Teachers and their students clustered there to be apart from the wider world. Although the bishop of Lincoln appointed Oxford’s chancellor in 1214, by the end of the century the masters elected their own officers. Cambridge – also rooted in loosely connected lecturers and their students – applied for a royal charter, and continued to petition for renewal with each change of monarch; it also received papal recognition in 1233. In the 14th and 15th centuries, German universities were usually founded by a local prince or baron, less often by a bishop, but nonetheless continued in the tradition of a corporation independent of the noble house. Universitas referred not to the school per se but to the law of incorporation, which recognized a collective entity. Thus, universities have always had the right and the obligation of independent governance.[11]

The disconnect comes from the fact that ethical standards are created by professional societies whose enforcement powers are limited. The American Physical Society could do little to Jan Hendrik Schön (even if he had been a member), but the University of Konstanz stripped him of his doctorate even though it found no fault with his thesis. If universities created rigorous curricula in ethics, that would close the information loop, completing the feedback cycle – applying proportional, integral, and derivative corrections to scientists’ variances from the norms of moral and ethical practice.
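That control-loop language can be made literal. A purely illustrative Python sketch of a discrete proportional-integral-derivative correction, where the deviation signal and the gains are hypothetical stand-ins for a community’s measured departure from its norms:

    def pid_correction(errors, kp=1.0, ki=0.1, kd=0.5):
        """Classic discrete PID step: respond to the current deviation (P),
        its accumulated history (I), and its rate of change (D)."""
        proportional = errors[-1]
        integral = sum(errors)
        derivative = errors[-1] - errors[-2] if len(errors) > 1 else 0.0
        return kp * proportional + ki * integral + kd * derivative

    # Hypothetical deviation-from-the-norms signal after each review cycle.
    deviations = [0.0, 0.2, 0.5, 0.4]
    print(pid_correction(deviations))  # corrective response for the latest cycle

The point of the analogy is only that an effective control system needs all three terms: immediate response, institutional memory, and sensitivity to trends.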

[1] First published as an article in the Antioch Review in 1950, the 1952 G. P. Putnam’s Sons book was reprinted by Dover in 1957; with at least 30 printings since, the work has never gone out of print.

[2] The Piltdown Hoax had little impact on academic research.

[3] Braxton, John M., “Deviancy from the Norms of Science: A Test of Control Theory,” Research in Higher Education, Vol. 31, No. 5 (Oct. 1990), pp. 461-476.

Braxton, John M., “Deviancy from the Norms of Science: The Effects of Anomie and Alienation in the Academic Profession,” Research in Higher Education, Vol. 34, No. 2 (Apr. 1993), pp. 213-228.

[4] Ladd, E. C., Jr., and Lipset, S. M. (1978). Technical Report: 1977 Survey of the American Professoriate. Storrs, Connecticut: Social Science Data Center, University of Connecticut. Distributed by ERIC Clearinghouse, 1978. A copy is archived at Michigan State University but is not available for interlibrary loan. The work was funded by the Stanford University Hoover Institution on War, Revolution, and Peace.

[5] Braxton, John M., “The Influence of Graduate Department Quality on the Sanctioning of Scientific Misconduct,” The Journal of Higher Education, Vol. 62, No. 1 (Jan.-Feb. 1991), pp. 87-108.

[6] I believe that I can prove that morality is an objective requirement of individual survival, e.g., the Greek idea of the good life, whereas ethics is conformance to social expectations, as when a gentleman on a city bus gives his seat to a woman. An example of an arguable ethic in science is the problem of whether and when to leave findings unreported, balancing your right to control your intellectual property against your obligation to protect others from immoral application of your work by third parties. This paper cannot address all of that, but those would be some of the lemmas and dilemmas in a 400-page Ethical Problems textbook.

[7] For examples of those last two, see Buzzelli, Donald E., “The Definition of Misconduct in Science: A View from NSF,” Science, New Series, Vol. 259, No. 5095 (Jan. 29, 1993), pp. 584-648.

[8] Di Tella, Rafael and Ernesto Schargrodsky. 2004. “Do Police Reduce Crime? Estimates Using the Allocation of Police Forces after a Terrorist Attack,” The American Economic Review, Vol. 94, No. 1, pp. 115-133.

[9] The False Claims Act of 1863 (amended 1986) makes it a crime to lie to the government in a contract. The same law provides monetary rewards to whistleblowers.

[10] Cunningham, William C., and Todd Taylor, The Hallcrest Report: Private Security and Police in America, Stoneham, Mass.: Butterworth-Heinemann, 1985. “This publication reports a 30-month descriptive research project performed by Hallcrest Systems, Inc., McLean, Virginia, under a grant from the National Institute of Justice, U.S. Department of Justice.”

[11] Duryea, Edwin D., The Academic Corporation: A History of College and University Governing Boards, Falmer Press (Taylor & Francis Group), New York and London, 2000. Leedham-Green, Elisabeth, A Concise History of the University of Cambridge, Cambridge University Press, 1996. Paulsen, Friedrich, The German Universities and University Study, Charles Scribner’s Sons, 1906. Hart, James Morgan, German Universities: A Narrative of Personal Experience, G. P. Putnam’s Sons, 1874.