by Janis Costello Ingham, University of California, Santa Barbara and Jennifer Horner, Medical University of South Carolina, Charleston
The four cases below are designed for your personal reflection or for seminar discussion. Each follows the same format: background facts, issue(s), legitimate expectations, consequences, and obligations.
The purpose of this supplement to The ASHA Leader article on the responsible conduct of research by Ingham and Horner ([2004, March 16]. Ethics and research. The ASHA Leader, pp. 10–11, 24.) is to engage audiologists and speech-language pathologists in a dialogue about the responsible conduct of research within our professions. We encourage all readers to explore the reading list provided below. The piece by Ingham and Horner stems from an NIH-funded research project for which Sharon Moss serves as principal investigator ("Research Integrity in ASHA: Education and Publication").
After providing a few salient facts, each case identifies an "issue"; the "legitimate expectations" of individuals or agencies (or, by implication, the public at large) who might be directly or indirectly affected by the conduct in the scenario; and some of the "consequences" of the hypothetical ethics quandaries. The last paragraph in each case, entitled "obligations," offers reasons why the conduct at issue entails moral concerns. These scenarios were inspired by real cases or by literature on specific research ethics topics.
The cases presented are "thin." This means that we have provided only brief summaries of fictionalized factual scenarios for the purposes of illustration. In actuality, ethics issues are complex, and they pose real problems. They are complex because they require an analysis of values and a balancing of benefits and harms. Ethics is not "black-or-white"; rather, it requires an analysis of information and reasoning about the values at stake.
As you read the cases, consider as many sides of the argument as you can, and, most importantly, consider the moral obligations involved as well as the foreseeable consequences of the ethically suspect conduct.
For instruction on the topic of moral reasoning that has informed the structure of these hypothetical cases, we recommend the work by Bebeau and colleagues at Indiana University, Moral Reasoning in Scientific Research (1995). For introductory books on the topic of research ethics, see especially Elliott and Stern (1997), Penslar (1995), Resnik (1998), and Shamoo and Resnik (2003). For a discussion of plagiarism, see Dartmouth College’s online publication Sources (1998). For full bibliographic references, see the reading list below. In each hypothetical vignette, explore the issues raised in the case. How serious is the problem that arose? Do you think the problem should prevent the scientist from pursuing a successful academic career? Why or why not?
CASE NO. 1: KARRIE—SCIENTIFIC MISCONDUCT/MENTOR-TRAINEE RESPONSIBILITIES
Karrie was an intelligent, ambitious young graduate student researcher. She chose her doctoral program on the basis of the reputation of one of the lead researchers in the discipline. She was thrilled to be accepted by her mentor and believed she was well on her way to a full-fledged academic career. As soon as she arrived at the university, she became involved in one of her mentor’s research projects. As a research assistant in her mentor’s lab, Karrie was responsible for monitoring and collating data generated from an online survey. Because she was involved in several academic courses and had a part-time job outside of school, Karrie found herself under extreme deadline pressure. With time running out, she filled out a number of online survey forms herself and added these results to the master database. When Karrie’s mentor later discovered that these data were fabricated, Karrie asserted that the mentor had never told her about scientific misconduct or principles of the responsible conduct of research.
Analysis:
- Issues: 1) Whether mentors have a responsibility to educate trainees about scientific misconduct; 2) Whether students can avoid responsibility for ethical lapses merely because they have not received direct instruction in the "responsible conduct of research."
- Legitimate expectations: 1) Students expect mentors to discuss research ethics issues, to explain the expectations of the scientific community and the public regarding research integrity, to provide opportunities to discuss the consequences of ethics lapses, and to articulate the obligations of all persons involved in the research process; 2) Mentors expect students to know the difference between right and wrong (honesty and dishonesty), whether or not they know the technical definition of "scientific misconduct."
- Consequences: Students who falsify or fabricate data and then shirk responsibility for those acts, and research mentors who excuse such behavior, dilute one of the basic tenets of scientific integrity—truthfulness.
- Obligations: Students have an obligation to adhere to the norm of truthfulness in all aspects of academic work, whether or not explicit instruction in research ethics is provided, because truth-telling is part of "common morality" and pervades all aspects of communal life.
Case adapted from: Office of Research Integrity (U.S. Department of Health and Human Services). (1999). Case Summaries, Karrie Recknor, University of Washington. ORI Newsletter, 8(1), 10.
CASE NO. 2: MARY—ADVERSE EVENT REPORTING
Mary, an associate professor, was becoming well established as a productive speech scientist. She worked in a high-pressure academic department in which the race for publications and grants was intense. Mary was building her reputation on studies of EMG biofeedback. One of her projects involved individuals with flaccid dysarthria acquired after experiencing brainstem strokes. A few of her research participants—those with sensorimotor impairment in the distribution of the trigeminal nerve—experienced significant facial pain during the experimental procedure. Because all individuals had given their informed consent, she believed her work was ethically sound, so she did not report any adverse events in her published manuscript. Another researcher had a special interest in EMG and based his research on Mary’s work. Unfortunately, two of his research participants developed chronic facial pain. When this researcher looked back at Mary’s work, he could find no indication of this type of adverse reaction. He was stymied and disappointed by his results and the time and financial resources he had spent pursuing this research path. His granting agency was disappointed, too, and discontinued his funding.
Analysis:
- Issues: 1) Whether a researcher has an obligation to report unanticipated adverse events experienced by research participants; 2) Whether editors should require the reporting of adverse events in published articles.
- Legitimate expectations: 1) Scientists expect to be able to rely on published research—not only evidence regarding benefit, but also evidence regarding adverse events experienced by research participants; 2) Institutional review boards and granting agencies expect researchers to report all serious adverse events that occur during the course of a study, whether anticipated or not.
- Consequences: Researchers’ failure to report adverse events undermines the integrity of the research process and record. This failure has the potential to harm future research participants and patients, as well as the design of research studies that rely on published reports.
- Obligations: Researchers have obligations, both ethical and regulatory, to report all adverse events occurring during the course of research, whether anticipated or not. Failure to do so breaches the general principle of truthfulness.
Inspiration for this case: Kolata, G. (2001, July 20). U.S. suspends human research at Johns Hopkins after a death. The New York Times.
CASE NO. 3: WES—PEER REVIEW
Wes completed his dissertation and was beginning to apply for an academic position. He was an honest and energetic individual who had benefited enormously from a close working relationship with his mentor. In anticipation of an assistant professor position, Wes submitted a grant application to the National Institutes of Health in his specialty area, the genetics of hearing loss. Although his grant was not funded, he continued to conduct small research projects and avidly read journals that addressed his special interest area. About a year after his disappointing submission to NIH, he read a research article that sounded very familiar. Upon closer review, he realized that someone on the grant review panel must have stolen his ideas and published them as her own. Disheartened and disillusioned, he gave up his dream of becoming a scientist.
Analysis:
- Issues: 1) Whether a grant applicant has a right to expect confidentiality in the peer review process; 2) Whether grant reviewers have a duty of nondisclosure regarding grant applications submitted to them.
- Legitimate expectations: 1) Researchers—including first-time grant applicants who submit grants to foundations, private corporations, and governmental agencies—expect their submissions to be treated confidentially; 2) All scholars expect to be given credit for their work; no one has a right to misappropriate the original work of another person.
- Consequences: Peer reviewers who misappropriate original ideas embodied in grant applications undermine the grant review process and potentially instill mistrust and cynicism among scientists.
- Obligations: Peer reviewers are obligated to honor the confidentiality of grant applications and to safeguard the original ideas embodied in grant applications. To allow peer reviewers to ignore the rule of confidentiality would effectively endorse unethical behaviors (such as misappropriation/theft, misrepresentation, fraud, plagiarism, and copyright infringement). Ultimately, disregard of confidentiality in the peer review process could mean the demise of the peer review process.
Inspiration for this case: Wessely, S. (1998). Peer review of grant applications: What do we know? The Lancet, 352, 301–305.
CASE NO. 4: MICHAEL—INFLATING THE RECORD
Michael, a doctoral student, was an aspiring language scientist. He worked in a laboratory in which the competition among graduate students was extreme, and students were required to submit their own NIH grant applications. In his application, Michael listed 11 manuscripts as "accepted" or "in press." All had been rejected for publication, but he fully expected that, with a few revisions, they would eventually be published. A year later, Michael received an NIH grant and was ready to move on to an academic position. Coincidentally, one of the members of the search committee had been a reviewer on Michael’s successful NIH grant application. He wanted to share Michael’s publications with the search committee, but could not find them in MEDLINE or any other database. When he asked Michael for reprints, Michael stated they were still "in press." Unfortunately, Michael could not produce editors’ letters of acceptance, and the search committee ultimately declined to invite him for an interview.
Analysis:
- Issue: Whether manuscripts that have merely been submitted for publication may be cited as "accepted" or "in press" in a grant application.
- Legitimate expectations: Readers (e.g., funding agencies, search committees) expect scientists’ curricula vitae to be accurate and truthful.
- Consequences: A grant applicant’s misrepresentation of manuscripts as "accepted" or "in press," when they have been submitted for publication but not yet accepted, inflates the applicant’s credentials, undermines the competitive process, and misleads peer reviewers.
- Obligations: Students, faculty, and other researchers have a responsibility to be truthful in all aspects of their research and scholarly activities, including reporting their credentials and scholarly record accurately, because the peer review process and ultimately the integrity of the scholarly record of the discipline depend on honest reporting.
Case adapted from: Office of Research Integrity (U.S. Department of Health and Human Services). (2001). Case Summaries, Michael K. Hartzer. ORI Newsletter, 9(2), 6.
READING LIST

Altman, L. K. (1996). The Ingelfinger rule, embargoes, and journal peer review—Part I. The Lancet, 347, 1382–1386.
American Speech-Language-Hearing Association. Code of Ethics. Available online: www.asha.org.
American Journal of Law & Medicine. (1998). Horner, J. (Symposium Editor), Law, medicine and socially responsible research (Symposium Issue), 24 (2&3). Boston, MA: American Society of Law, Medicine & Ethics and Boston University School of Law.
Angell, M. (2000). Is academic medicine for sale? (Editorial). New England Journal of Medicine, 342, 1516–1518.
Angell, M., & Kassirer, J. P. (1996). Editorials and conflicts of interest. New England Journal of Medicine, 335, 1055–1056.
Bailar, J. C. (1986). Science, statistics, and deception. Annals of Internal Medicine, 104, 259–260.
Beauchamp, T. L., & Childress, J. F. (1994). Principles of biomedical ethics (4th ed.). New York: Oxford University Press.
Bebeau, M. J., Pimple, K. D., Muskavitch, K. M. T., Borden, S. L., & Smith, D. H. (1995). Moral reasoning in scientific research. Bloomington, IN: Indiana University. Available online: http://www.poynter.indiana.edu/mr.pdf [site visited 01/09/04].
Berns, K. I., & Manning, F. J. (Eds.). (1996). Committee on Resource Sharing in Biomedical Research, Division of Health Sciences Policy, Institute of Medicine. Resource sharing in biomedical research. Washington, DC: National Academy Press. Available online: http://stills.nap.edu/html/biomed [site visited 01/09/04].
Bingham, C. (1998). Peer review on the Internet: A better class of conversation. The Lancet, 351, S110–S114.
Blumenthal, D., Campbell, E. G., Causino, N., & Louis, K. S. (1996). Participation of life-science faculty in research relationships with industry. New England Journal of Medicine, 335, 1734–1739.
Blumenthal, D., Causino, N., Campbell, E., & Louis, K. S. (1996). Relationships between academic institutions and industry in the life sciences—An industry survey. New England Journal of Medicine, 334(6), 368–373.
Burk, D. L. (1995). Research misconduct: Deviance, due process, and the disestablishment of science. George Mason Independent Law Review, 3, 305.
Callahan, E. S., & Dworkin, T.M. (2000). The state of state whistleblower protection. The American Business Law Journal, 38, 99.
Chalmers, I. (1999, Dec. 20). When silence is lethal. Chemistry and Industry, 961.
Chung, A. W. (2001). Resuscitating the constitutional “theory” of academic freedom: A search for a standard beyond Pickering and Connick. Stanford Law Review, 53, 915.
Chung, J. (2001). Does simultaneous research make an invention obvious? The 35 U.S.C. s 103 nonobvious requirement for patents as applied to the simultaneous research problem. Albany Law Journal of Science & Technology, 11, 337.
Committee on Science, Engineering, and Public Policy, National Academy of Sciences, National Academy of Engineering & Institute of Medicine. (1995). On being a scientist: Responsible conduct in research. Washington, DC: National Academy Press.
Committee on Science, Engineering, and Public Policy, National Academy of Sciences, National Academy of Engineering & Institute of Medicine. (1997). Adviser, teacher, role model, friend: On being a mentor to students in science and engineering. Washington, DC: National Academy Press.
Court, C., & Dillner, L. (1994). Obstetrician suspended after research inquiry. BMJ, 309, 1459.
Cretsinger, C. E., & Menell, P. S. (2001). Annual review of law and technology: Foreword. Berkeley Technology Law Journal, 16, 1.
Culliton, B. J. (1990). Inside the Gallo probe. Science, 248, 1494–1498.
Dartmouth College. (1998). Sources: Their use and acknowledgment. Hanover, NH: Trustees of Dartmouth College. Available online: www.dartmouth.edu/~sources [site visited 01/09/04].
Dingell, J. D. (1993). Misconduct in medical research (Shattuck Lectures). New England Journal of Medicine, 328, 1610.
Dreyfuss, R. C. (2000). Collaborative research: Conflicts on authorship, ownership, and accountability. Vanderbilt Law Review, 53, 1162.
Eastwood, S., Derish, P., Leash, E., & Ordway, S. (1996). Ethical issues in biomedical research: Perceptions and practices of postdoctoral research fellows responding to a survey. Science and Engineering Ethics, 2, 89–114.
Elliott, D., & Stern, J. E. (1997). Research ethics: A reader. Hanover, NH: University Press of New England.
Ferguson, J. R. (1997). Biomedical research and insider trading (Sounding Board). New England Journal of Medicine, 337, 631–634.
Fotion, N., & Conrad, C. C. (1984). Authorship and other credits. Annals of Internal Medicine, 100, 592–594.
Friedberg, M., Saffran, B., Stinson, T. J., Nelson, W., & Bennett, C. L. (1999). Evaluation of conflict of interest in economic analyses of new drugs used in oncology. JAMA, 282, 1453.
Garry, R. F. (1999). BMJ’s editors should publish their own conflicts of interest regularly (Letter). BMJ, 318, 464.
Goldner, J. A. (1998). The unending saga of legal controls over scientific misconduct: A clash of cultures needing resolution. American Journal of Law & Medicine, 24, 293–343.
Goodman, N. W. (1994). Survey of fulfillment of criteria for authorship in published medical research. BMJ, 309, 1482.
Grodin, M. A. (Ed.). (1995). Meta medical ethics: The philosophical foundations of bioethics. Boston: Kluwer.
Hamilton, J. B., Greco, A. J., & Tanner, J. R. (1997). Ethical questions regarding joint authorship: Business and nonbusiness faculty perceptions on noncontributing authorship. Journal of Education for Business, 72(6), 325–330.
Heitman, E. (2000, July). Ethical values in the education of biomedical researchers. The Hastings Center Report, 30, S40.
Hixson, J. (1976). The patchwork mouse. Garden City: Anchor Press/Doubleday.
Horner, J. (2003). Morality, ethics, and law: Introductory concepts. In N. Helm-Estabrooks & N. Bernstein Ratner (Eds.) & J. Horner (Guest Ed.), Ethical, moral and legal issues in speech and language pathology. Seminars in Speech and Language, 24(4), 263–274.
Horton, R. (1999). Scientific misconduct: Exaggerated fear but still real and requiring a proportionate response. The Lancet, 354, 7.
Horton, R., & Smith, R. (1996). Time to redefine authorship (Editorial). BMJ, 312, 723.
Howard, E. (1994). Science misconduct and due process: A case of process due. Hastings Law Journal, 45, 309.
Huston, P., & Moher, D. (1996). Redundancy, disaggregation, and the integrity of medical research. The Lancet, 347, 1024–1026.
Huth, E. J. (1986). Irresponsible authorship and wasteful publication. Annals of Internal Medicine, 104, 257–259.
Ingham, J. C. (2003). Research ethics 101: The responsible conduct of research. In N. Helm-Estabrooks & N. Bernstein Ratner (Eds.) & J. Horner (Guest Ed.), Ethical, moral and legal issues in speech and language pathology. Seminars in Speech and Language, 24(4), 323–337.
International Committee of Medical Journal Editors. (1993). Uniform requirements for manuscripts submitted to biomedical journals. JAMA, 269, 2282–2286.
Kalichman, M. W. The online resource for instruction in the responsible conduct of research. Available online: http://ethics.ucsd.edu/resources/resources-training.html.
Kassirer, J. P. (1993). The frustrations of scientific misconduct. New England Journal of Medicine, 328, 1634–1636.
Kulynych, J. (1998). Intent to deceive: Mental state and scienter in the new uniform federal definition of scientific misconduct. Stanford Technology Law Review. Available online: http://stlr.stanford.edu/STLR/Articles/98_STLR_2/ [site visited 02/02/04].
Kuzma, S. M. (1992). Criminal liability for misconduct in scientific research. University of Michigan Journal of Law Reform, 25, 357.
LaFollette, M. C. (1992). Stealing into print: Fraud, plagiarism, and other misconduct in scientific publishing. Berkeley: University of California Press.
LaFollette, M. C. (1994). The pathology of research fraud: The history and politics of the U.S. experience. Journal of Internal Medicine, 235, 129–135.
Leach, R. E. (2001). Honesty—A fundamental trait. The American Journal of Sports Medicine, 29, 1.
Line, S. (1993). Scientific misconduct: A form of white coat crime. The Journal of Pharmacy & Law, 2, 15.
List, C. J. (1984, Fall). Scientific fraud: Social deviance or the failure of virtue? Science, Technology & Human Values, 10, 27–36.
Lundberg, G. D., & Flanagin, A. (1989). New requirements for authors: Signed statements of authorship responsibility and financial disclosure. JAMA, 262, 2003–2004.
McGarity, T. (1994). Peer review in awarding federal grants in the arts and sciences. High Technology Law Journal, 9, 1–92.
Mertelsmann, R. (2001). Haematologist may face disciplinary action for research fraud. BMJ, 322, 694.
National Bioethics Advisory Commission. (2001). Ethical and policy issues in research involving human participants. (Draft Report). Bethesda, MD, December 19, 2000. Final recommendations, May 18, 2001. Available online: http://bioethics.gov/#final [site visited 01/09/04].
Nelkin, D. (1996). An uneasy relationship: The tensions between medicine and the media. The Lancet, 347, 1600–1603.
Office of Research Integrity, U.S. Department of Health and Human Services. ORI provides working definition of plagiarism. Available online: http://ori.dhhs.gov/policies/plagiarism.shtml.
Office of Research Integrity, U.S. Department of Health and Human Services. (1999). Guidelines for editors. ORI Newsletter, 7(3), 1–2.
Office of Research Integrity, U.S. Department of Health and Human Services. (1999). On the duty of faculty members to speak out on misconduct. ORI Newsletter, 7(3), 4.
O’Reilly, J. T. (2000). Elders, surgeons, regulators, jurors: Are medical experimentation’s mistakes too easily buried? Loyola University Chicago Law Journal, 31, 317.
Panel on Scientific Responsibility and the Conduct of Research (Committee on Science, Engineering, and Public Policy [COSEPUP], sponsored by the National Academy of Sciences, National Academy of Engineering, and Institute of Medicine). (1992). Responsible science: Ensuring the integrity of the research process (Vol. I). Washington, DC: National Academy Press.
Pascal, C. B. (2000). Scientific misconduct and research integrity for the bench scientist. Society for Experimental Biology and Medicine, 224, 220–230.
Penslar, R. L. (Ed.). (1995). Research ethics: Cases & materials. Bloomington: Indiana University Press.
Pfeifer, M. P., & Snodgrass, G. L. (1990). The continued use of retracted, invalid scientific literature. JAMA, 263(10), 1420–1423.
Prentice, R. A. (1999). Clinical trial results, physicians, and insider trading. Journal of Legal Medicine, 20, 2.
Ramsay, S. (2001). UK consultant censured for failure to act on junior’s research fraud. The Lancet, 357, 780.
Relman, A. S. (1983). Lessons from the Darsee affair. New England Journal of Medicine, 308, 1415–1417, cited by LaFollette, M. C. (1994). The pathology of research fraud: The history and politics of the U.S. experience. Journal of Internal Medicine, 235, 130.
Relman, A. S. (1979). An open letter to the news media. New England Journal of Medicine, 300, 554–555.
Rennie, D., & Flanagin, A. (1994). Authorship! authorship! guests, ghosts, grafters, and the two-sided coin. JAMA, 271, 469–471.
Resnik, D. B. (1998). Ethics of science: An introduction. London: Routledge.
Riesenberg, D., & Lundberg, G. D. (1990). The order of authorship: Who’s on first? (Editorial). JAMA, 264(14), 1857, citing Lundberg, G. D. & Flanagin, A. (1989). New requirements for authors: Signed statements of authorship responsibility and financial disclosure. JAMA, 262, 2003–2004.
Rosenberg, S. A. (1996). Secrecy in medical research (Sounding Board). New England Journal of Medicine, 334(6), 392–394.
Shalala, D. (2000). Protecting research subjects—What must be done? (Sounding Board). New England Journal of Medicine, 343(11), 808–810.
Shamoo, A. E., & Resnik, D. B. (2003). Responsible conduct of research. New York: Oxford University Press.
Shapiro, D. W., Wenger, N. S., & Shapiro, M. F. (1994). The contribution of authors to multiauthored biomedical research papers. JAMA, 271, 438–442.
Sheikh, A. (2000). Publication ethics and the research assessment exercise: Reflections on the troubled question of authorship. Journal of Medical Ethics, 26(6), 422–427.
Smith, J. (1994). Gift authorship: A poisoned chalice? (Editorial). BMJ, 309, 1456–1457.
Smith, R. (1997). Authorship: Time for a paradigm shift? (Editorial). BMJ, 314, 992.
Smith, R. (1998). Beyond confidential: Transparency is the key. BMJ, 317, 291–292.
Spencer, F. (1990). Piltdown: A scientific forgery. New York: Oxford University Press.
U.S. Department of Health and Human Services. (1989). Responsibility of PHS awardee and applicant institutions for dealing with and reporting possible misconduct in science. Federal Register 54 (August 8), 32449, codified as 42 CFR Part 50 Subpart A.
Wessely, S. (1998). Peer review of grant applications: What do we know? The Lancet, 352, 301–305.
Wilmshurst, P. (1997). The code of silence. The Lancet, 349, 457.
Woodward, B. (1999). Challenges to human subject protections in US medical research. JAMA, 282, 1947.