March 16, 2004 Features

Ethics and Research

A well-known researcher, highly successful in obtaining extramural research funding and having a long list of publications, insisted that her name be included as an author on all publications emanating from her laboratory. A clinical scientist, testing the effects of an experimental treatment, compared scores from a treated group of participants to scores from a group of individuals from whom treatment was withheld. A university professor, serving as a reviewer of a manuscript submitted for publication in a research journal, provided copies of the manuscript to graduate students so that they could prepare their own critiques, as an educational exercise in peer review…

…An ambitious graduate student, when analyzing data collected for his master's thesis, changed the numbers in two experimental conditions to make the results fit his hypothesis. A young assistant professor, eager to bolster her publication record in time for her tenure review, included published findings from another scientist in a manuscript submitted for publication and reported them as her own. A busy senior scientist, mentoring several PhD and postdoctoral students, neglected to monitor their methods of data collection.

What do these cases have in common? All of them raise questions about breaches of research ethics or of research misconduct. The latter, defined by the U.S. Department of Health and Human Services in language adhered to by the National Institutes of Health, is: "fabrication, falsification, or plagiarism, or other practices that seriously deviate from those that are commonly accepted within the scientific community for proposing, conducting, or reporting research. Misconduct does not include honest error or honest differences in interpretations or judgments of data."

Why should I be concerned about research misconduct? The consequences of violations of research ethics can be far-reaching. On moral grounds, society expects individuals to lead their lives with honesty and the utmost respect for the well-being of others (see Horner). These expectations are highest for those in positions of trust, who are responsible for the health, safety, and welfare of others, such as religious leaders, educators, and government officials. Ethical transgressions within these groups are considered especially abhorrent. And so it is with scientists, whose life's work is fundamentally about seeking truth (fact) and developing an understanding of natural and human phenomena. From the most theoretical of physicists to the most applied of clinical researchers, the underlying search for new knowledge is ultimately tied to the enhancement of human life. When those whose job it is to make discoveries and enrich our lives disregard principles of intellectual honesty, the credibility of science, and of those who perform it, is inestimably undermined.

What are the practical consequences of research misconduct? Recent confirmed cases of scientific misconduct are illustrative. In these cases fraudulent research activities derailed the development of a vaccine against hepatitis C, interfered with progress in understanding metastasis in prostate cancer, produced misleading data regarding auditory processing in Broca's aphasia, and misrepresented findings aimed at understanding chemical phenomena active in the edema associated with traumatic brain injury. The consequences of misconduct in one particular case, in which a graduate student falsified and fabricated data, were described as "adversely and materially affect[ing] the laboratory's ongoing research…by creating uncertainty about all his experimental results, necessitating verification and repetition of experiments, preventing the reporting of results for publication, and preventing the principal investigator from submitting a competitive renewal application for an NIH grant" (see ORI Annual Report). All of these examples underscore the point that the impacts of research misconduct are serious and far-reaching.

What should be done about research misconduct? As Chris Pascal, director of the Office of Research Integrity (ORI), has written recently, an important strategy to counteract occurrences of scientific misconduct is "…to instill key principles of responsible research into the mission, culture, and curricula of research institutions…In ORI's view, responsible research practices are critical to the quality of research. Education in these practices is necessary to develop researchers' skills and competencies not only in integrity issues, but also in the actual conduct of research."

Efforts to provide education regarding the responsible conduct of research (RCR) abound. (See the Web site of the newly formed RCR Education Consortium, http://rcrec.org, as an example, and the sidebar from Ingham that accompanies this article on page 24 for a typical curriculum.)

ASHA's Code of Ethics provides a modicum of guidance in regard to research ethics. Among the 18 statements dealing with this topic are items related to conflict of interest, assignment of credit and acknowledgment of sources, treatment of human research participants, and truthfulness (six items!).

Perhaps as important as any of these is item IV-I: "Individuals who have reason to believe that the Code of Ethics has been violated shall inform the Board of Ethics." All members of ASHA must adhere to this "duty to report" any known, verifiable acts of scientific misconduct so that the highest moral, ethical, and research standards are upheld in our discipline. Essentially all institutions also have reporting and investigatory mechanisms in place to deal with allegations of misconduct, and it is important to become familiar with the procedure used in one's place of employment. If each of us (scientist, teacher, student, clinician) accepts this difficult but important responsibility, and if RCR education becomes a meaningful part of our curricula, our discipline will advance not only the quality of what our research discovers, but also the ethics of how we manage the discovery process. Adherence to high research ethics standards is essential if we are to trust the science that underpins all that we do.

Janis Costello Ingham is professor of speech and hearing sciences at the University of California, Santa Barbara. She served as ASHA's vice president for research and technology and is currently vice president for research and academic development for the Council of Graduate Programs in Communication Sciences and Disorders. The Researcher Town Meeting that she organized in 1998 was ASHA's first foray into issues of research ethics. Contact her by e-mail at jcingham@speech.ucsb.edu.

Jennifer Horner is an associate professor in the College of Health Professions, Medical University of South Carolina. She is program director of communication sciences and disorders and chair of the department of rehabilitation sciences. Horner, who also holds a JD degree from Boston University School of Law, completed a fellowship at the University of Chicago's MacLean Center for Clinical Medical Ethics. Contact her by e-mail at hornerj@musc.edu.

cite as: Ingham, J. C.  & Horner, J. (2004, March 16). Ethics and Research. The ASHA Leader.

Case Studies

Karrie: Case No. 1

Karrie was an intelligent, ambitious young graduate student researcher. She chose her doctoral program on the basis of the reputation of one of the lead researchers in the discipline. She was thrilled to be accepted by her mentor and believed she was well on her way to a full-fledged academic career. As soon as she arrived at the university, she became involved in one of her mentor's research projects. As a research assistant in her mentor's lab, Karrie was responsible for monitoring and collating data generated from an online survey. Because she was involved in several academic courses and had a part-time job outside of school, Karrie found herself under intense deadline pressure. With time running out, she filled out a number of online survey forms herself and added these results to the master database. When Karrie's mentor later discovered that the data were fabricated, Karrie asserted that it wasn't her fault because the mentor had never told her about scientific misconduct or responsible conduct of research principles.

Mary: Case No. 2

Mary, an associate professor, was becoming well established as a productive speech scientist. She worked in a high-pressure academic department in which the race for publications and grants was intense. Mary was building her reputation on studies of EMG biofeedback. One of her projects involved individuals with flaccid dysarthria acquired as a consequence of brainstem strokes. A few of her research participants (those with sensorimotor impairment in the distribution of the trigeminal nerve) experienced significant facial pain during the experimental procedure. Because all individuals had given their informed consent, she believed her work was ethically sound, so she did not report any adverse events in her published manuscript. Another researcher had a special interest in EMG and based his research on Mary's work. Unfortunately, two of his research participants developed chronic facial pain. When this researcher looked back at Mary's work, he could find no indication of this type of adverse reaction. He was stymied and disappointed by his results and the time and financial resources he had spent pursuing this research path. His granting agency was disappointed, too, and discontinued his funding.

Wes: Case No. 3

Wes had completed his dissertation and was beginning to apply for academic positions. He was an honest and energetic individual, who had benefited enormously from a close working relationship with his mentor. In anticipation of an assistant professor position, Wes submitted a grant to the National Institutes of Health (NIH) in his specialty area, the genetics of hearing loss. Despite the fact that his grant was not funded, he continued to conduct small research projects, and avidly read a number of journals in his special interest area. About a year after his disappointing submission to NIH, he read a research article that seemed very familiar. Upon closer review, he realized that someone on the grant review panel must have stolen his original ideas and published them as her own. Disheartened and disillusioned, he gave up his dream of becoming a scientist.

Michael: Case No. 4

Michael, a doctoral student, was an aspiring language scientist. He worked in a laboratory in which the competition among graduate students was extreme, and students were required to submit their own NIH grant applications. In his application, Michael listed 11 manuscripts as "accepted" or "in press." All had been rejected for publication, but he fully expected that, with a few revisions, they would eventually be published. A year later, Michael had received an NIH grant. He was ready to move on to an academic position. Coincidentally, one of the members of a search committee had been a reviewer on Michael's successful grant application. He wanted to share Michael's publications with the search committee, but could not find them in MEDLINE or any other database. When he asked Michael for reprints, Michael stated they were still "in press." Unfortunately, Michael could not produce editors' letters of acceptance, and the search committee ultimately declined to invite him for an interview.



Core Instructional Areas for the Responsible Conduct of Research

Ethics and morality

The philosophic background, including moral development theory, required to support ethical decision-making and moral reasoning skills. 

Research misconduct

The meaning of research misconduct and the regulations, policies, and guidelines that govern research misconduct in PHS-funded institutions. Includes topics such as fabrication, falsification, and plagiarism; error vs. intentional misconduct; institutional misconduct policies; identifying misconduct; procedures for reporting misconduct; protection of whistleblowers; and outcomes of investigations, including institutional and federal actions. 

Human subjects protections

Issues important in conducting research involving human subjects. Includes topics such as the definition of human subjects research, ethical principles for conducting human subjects research, informed consent, confidentiality and privacy of data and patient records, risks and benefits, preparation of a research protocol, institutional review boards, adherence to study protocol, proper conduct of the study, and gender, minority, and children's research issues. 

Publication practices and responsible authorship

The purpose and importance of scientific publication, and the responsibilities of the authors. Includes topics such as collaborative work and assigning appropriate credit, acknowledgments, appropriate citations, repetitive publications, fragmentary publication, sufficient description of methods, corrections and retractions, conventions for deciding upon authors, authors responsibilities, and the pressure to publish. 

Peer review

The purpose of peer review in determining merit for research funding and publications. Includes topics such as the definition of peer review, impartiality, how peer review works, editorial boards and ad hoc reviewers, responsibilities of the reviewers, privileged information, and confidentiality. 

Mentor/trainee relationships

The responsibilities of mentors and trainees in predoctoral and postdoctoral research programs. Includes the role of a mentor, responsibilities of a mentor, conflicts between mentor and trainee, collaboration and competition, selection of a mentor, and abusing the mentor/trainee relationship. 

Conflict of interest and commitment

The definition of conflicts of interest and how to handle conflicts of interest. Types of conflicts encountered by researchers and institutions. Includes topics such as conflicts associated with collaborators, publication, financial conflicts, obligations to other constituencies, and other types of conflicts. 

Data acquisition, management, sharing, and ownership

Accepted practices for acquiring and maintaining research data. Proper methods for record keeping and electronic data collection and storage in scientific research. Includes defining what constitutes data; keeping data notebooks; data selection, retention, sharing, ownership, and analysis; and data as legal documents and intellectual property, including copyright laws. 

Collaborative science

Research collaborations and issues that may arise from such collaborations. Includes topics such as setting ground rules early in the collaboration, avoiding authorship disputes, and the sharing of materials and information with internal and external collaborating scientists. 

Animal subjects protection

Issues important to conducting research involving animals. Includes topics such as the definition of research involving animals, ethical principles for conducting research on animals, federal regulations governing animal research, institutional animal care and use committees, and treatment of animals.

Guidelines and regulations

Identification and understanding of existing federal and local policies, methods of compliance, and consequences for not complying.

Adapted from the Office of Research Integrity and Kalichman. From Seminars in Speech and Language, 2003, 24(4), 323–337. Reprinted with permission.


References

American Speech-Language-Hearing Association. Code of Ethics. Available at www.asha.org.

Horner, J. (2003). Morality, ethics, and law: Introductory concepts. In N. Helm-Estabrooks & N. Bernstein Ratner (Eds.) & J. Horner (guest Ed.), Ethical, moral and legal issues in speech and language pathology. Seminars in Speech and Language, 24(4), 263-274. 

Ingham, J. C. (2003). Research ethics 101: The responsible conduct of research. In N. Helm-Estabrooks & N. Bernstein Ratner (Eds.) & J. Horner (guest Ed.), Ethical, moral and legal issues in speech and language pathology. Seminars in Speech and Language, 24(4), 323-337.

Kalichman, M. W. The online resource for instruction in the responsible conduct of research. Available at http://ethics.ucsd.edu/resources/resources-training.html

ORI Annual Report, 2002. Available at http://ori.dhhs.gov/

Pascal, C. B. (2003, November). Viewpoint: From the president to the principal investigator: Promoting the responsible conduct of research. AAMC Reporter.

U.S. Department of Health and Human Services. (1989). Responsibility of PHS awardee and applicant institutions for dealing with and reporting possible misconduct in science. Federal Register 54 (Aug. 8), 32449, codified as 42 CFR Part 50 Subpart A.



Ethics and Research: Further Discussion and Resources

by Janis Costello Ingham, University of California, Santa Barbara, and Jennifer Horner, Medical University of South Carolina, Charleston

The four cases below are designed for your personal reflection or for seminar discussion. Each follows the same format: background facts, issue(s), legitimate expectations, consequences, and obligations.

The purpose of this supplement to The ASHA Leader article on the responsible conduct of research by Ingham and Horner ([2004, March 16]. Ethics and research. The ASHA Leader, pp. 10–11, 24.) is to engage audiologists and speech-language pathologists in a dialogue about the responsible conduct of research within our professions. We encourage all readers to explore the reading list provided below. The piece by Ingham and Horner stems from an NIH-funded research project for which Sharon Moss serves as principal investigator ("Research Integrity in ASHA: Education and Publication").

After providing a few salient facts, each case identifies an "issue"; the "legitimate expectations" of individuals or agencies (or, by implication, the public at large) who might be directly or indirectly affected by the conduct in the scenario; and some of the "consequences" of the hypothetical ethics quandaries. The last paragraph in each case, entitled "obligations," offers reasons why the conduct at issue entails moral concerns. These scenarios have been inspired by real cases or by literature on specific research ethics topics.

The cases presented are "thin." This means that we have provided only brief summaries of fictionalized factual scenarios for the purposes of illustration. In actuality, ethics issues are complex, and they pose real problems. They are complex because they require an analysis of values and a balancing of benefits and harms. Ethics is not "black-or-white"; rather, it requires an analysis of information and reasoning about the values at stake.

As you read the cases, consider as many sides of the argument as you can, and, most importantly, consider the moral obligations involved as well as the foreseeable consequences of the ethically suspect conduct.

For instruction on the topic of moral reasoning that has informed the structure of these hypothetical cases, we recommend the work by Bebeau and colleagues, Indiana University, titled Moral Reasoning in Scientific Research (1995). For introductory books on the topic of research ethics, see especially Elliott and Stern (1997), Penslar (1995), Resnik (1998), and Shamoo and Resnik (2003). For a discussion of plagiarism, see Dartmouth College’s online publication Sources (1998). For full bibliographic references, see the reading list below. In each hypothetical vignette, explore the issues raised in the case. How serious is the problem that arose? Do you think the problem should prevent the scientist from pursuing a successful academic career? Why or why not?

CASE NO. 1: KARRIE—SCIENTIFIC MISCONDUCT/MENTOR-TRAINEE RESPONSIBILITIES

Karrie was an intelligent, ambitious young graduate student researcher. She chose her doctoral program on the basis of the reputation of one of the lead researchers in the discipline. She was thrilled to be accepted by her mentor and believed she was well on her way to a full-fledged academic career. As soon as she arrived at the university, she became involved in one of her mentor’s research projects. As a research assistant in her mentor’s lab, Karrie was responsible for monitoring and collating data generated from an online survey. Because she was involved in several academic courses and had a part-time job outside of school, Karrie found herself under extreme deadline pressure. With time running out, she filled out a number of online survey forms herself and added these results to the master database. When Karrie’s mentor later discovered that these data were fabricated, Karrie asserted that the mentor had never told her about scientific misconduct or principles of the responsible conduct of research.

Analysis:

  • Issues: 1) Whether mentors have a responsibility to educate trainees about scientific misconduct; 2) Whether students can avoid responsibility for ethical lapses merely because they have not received direct instruction in the "responsible conduct of research."
  • Legitimate expectations: 1) Students expect mentors to discuss research ethics issues, to explain the expectations of the scientific community and the public regarding research integrity, to provide opportunities to discuss the consequences of ethics lapses, and to articulate the obligations of all persons involved in the research process; 2) Mentors expect students to know the difference between right and wrong (honesty and dishonesty), whether or not they know the technical definition of "scientific misconduct."
  • Consequences: Students who falsify or fabricate data and then shirk responsibility for those acts, and research mentors who excuse such behavior, dilute one of the basic tenets of scientific integrity—truthfulness.
  • Obligations: Students have an obligation to adhere to the norm of truthfulness in all aspects of academic work, whether or not explicit instruction in research ethics is provided, because truth-telling is part of "common morality" and pervades all aspects of communal life.

Case adapted from: Office of Research Integrity (U.S. Department of Health and Human Services). (1999). Case Summaries, Karrie Recknor, University of Washington. ORI Newsletter, 8(1), 10.

CASE NO. 2: MARY—ADVERSE EVENT REPORTING

Mary, an associate professor, was becoming well established as a productive speech scientist. She worked in a high-pressure academic department in which the race for publications and grants was intense. Mary was building her reputation on studies of EMG biofeedback. One of her projects involved individuals with flaccid dysarthria acquired after experiencing brainstem strokes. A few of her research participants—those with sensorimotor impairment in the distribution of the trigeminal nerve—experienced significant facial pain during the experimental procedure. Because all individuals had given their informed consent, she believed her work was ethically sound, so she did not report any adverse events in her published manuscript. Another researcher had a special interest in EMG and based his research on Mary’s work. Unfortunately, two of his research participants developed chronic facial pain. When this researcher looked back at Mary’s work, he could find no indication of this type of adverse reaction. He was stymied and disappointed by his results and the time and financial resources he had spent pursuing this research path. His granting agency was disappointed, too, and discontinued his funding.

Analysis:

  • Issues: 1) Whether a researcher has an obligation to report unanticipated adverse events experienced by research participants; 2) Whether editors should require the reporting of adverse events in published articles.
  • Legitimate expectations: 1) Scientists expect to be able to rely on published research—not only evidence regarding benefit, but also evidence regarding adverse events experienced by research participants; 2) Institutional review boards and granting agencies expect researchers to report all serious adverse events that occur during the course of a study, whether anticipated or not.
  • Consequences: Researchers’ failure to report adverse events undermines the integrity of the research process and record. This failure has the potential to harm future research participants and patients, as well as the design of research studies that rely on published reports.
  • Obligations: Researchers have obligations, both ethical and regulatory, to report all adverse events occurring during the course of research, whether anticipated or not. Failure to do so breaches the general principle of truthfulness.

Inspiration for this case: Kolata, G. (2001, July 20). U.S. suspends human research at Johns Hopkins after a death. The New York Times.

CASE NO. 3: WES—PEER REVIEW

Wes completed his dissertation and was beginning to apply for an academic position. He was an honest and energetic individual, who had benefited enormously from a close working relationship with his mentor. In anticipation of an assistant professor position, Wes submitted a grant to the National Institutes of Health in his specialty area, the genetics of hearing loss. Despite the fact that his grant was not funded, he continued to conduct small research projects and avidly read journals that addressed his special interest area. About a year after his disappointing submission to NIH, he read a research article that sounded very familiar. Upon closer review, he realized that someone on the grant review panel must have stolen his ideas and published them as her own. Disheartened and disillusioned, he gave up his dream of becoming a scientist.

Analysis:

  • Issues: 1) Whether a grant applicant has a right to expect confidentiality in the peer review process; 2) Whether grant reviewers have a duty of nondisclosure regarding grant applications submitted to them.
  • Legitimate expectations: 1) Researchers—including first-time grant applicants who submit grants to foundations, private corporations, and governmental agencies—expect their submissions to be treated confidentially; 2) All scholars expect to be given credit for their work; no one has a right to misappropriate the original work of another person.
  • Consequences: Peer reviewers who misappropriate original ideas embodied in grant applications undermine the grant review process and potentially instill mistrust and cynicism among scientists.
  • Obligations: Peer reviewers are obligated to honor the confidentiality of grant applications and to safeguard the original ideas embodied in grant applications. To allow peer reviewers to ignore the rule of confidentiality would effectively endorse unethical behaviors (such as misappropriation/theft, misrepresentation, fraud, plagiarism, and copyright infringement). Ultimately, disregard of confidentiality in the peer review process could mean the demise of the peer review process.

Inspiration for this case: Wessely, S. (1998). Peer review of grant applications: What do we know? The Lancet, 352(9124), 301–305.

CASE NO. 4: MICHAEL—INFLATING THE RECORD

Michael, a doctoral student, was an aspiring language scientist. He worked in a laboratory in which the competition among graduate students was extreme, and students were required to submit their own NIH grant applications. In his application, Michael listed 11 manuscripts as "accepted" or "in press." All had been rejected for publication, but he fully expected that, with a few revisions, they would eventually be published. A year later, Michael had received an NIH grant. He was ready to move on to an academic position. Coincidentally, one of the members of the search committee had been a reviewer on Michael’s successful NIH grant application. He wanted to share Michael’s publications with the search committee, but could not find them in MEDLINE or any other database. When he asked Michael for reprints, Michael stated they were still "in press." Unfortunately, Michael could not produce editors’ letters of acceptance, and the search committee ultimately declined to invite him for an interview.

Analysis:

  • Issue: Whether manuscripts submitted for publication may be cited as "accepted" or "in press" in a grant application.
  • Legitimate expectations: Readers (e.g., funding agencies, search committees) expect scientists’ curricula vitae to be accurate and truthful.
  • Consequences: A grant applicant’s misrepresentation of his manuscripts as "accepted" or "in press," when they have been submitted for publication but not yet accepted, inflates the applicant’s credentials, undermines the competitive process, and misleads peer reviewers.
  • Obligations: Students, faculty, and other researchers have a responsibility to be truthful in all aspects of their research and scholarly activities, including reporting their credentials and scholarly record accurately, because the peer review process and ultimately the integrity of the scholarly record of the discipline depend on honest reporting.

Case adapted from: Office of Research Integrity (U.S. Department of Health and Human Services). (2001). Case Summaries, Michael K. Hartzer. ORI Newsletter, 9(2), 6. 



Select Reading List

Altman, L. K. (1996). The Ingelfinger rule, embargoes, and journal peer review—Part I. The Lancet, 347, 1382–1386.

American Speech-Language-Hearing Association. Code of Ethics. Available online: www.asha.org.

American Journal of Law & Medicine. (1998). Horner, J. (Symposium Editor), Law, medicine and socially responsible research (Symposium Issue), 24 (2&3). Boston, MA: American Society of Law, Medicine & Ethics and Boston University School of Law.

Angell, M. (2000). Is academic medicine for sale? (Editorial). New England Journal of Medicine, 342, 1516–1518.

Angell, M., & Kassirer, J. P. (1996). Editorials and conflicts of interest. New England Journal of Medicine, 335, 1055–1056.

Bailar, J. C. (1986). Science, statistics, and deception. Annals of Internal Medicine, 104, 259–260.

Beauchamp, T. L., & Childress, J. F. (1994). Principles of biomedical ethics (4th ed.). New York: Oxford University Press.

Bebeau, M. J., Pimple, K. D., Muskavitch, K. M. T., Borden, S. L., & Smith, D. H. (1995). Moral reasoning in scientific research. Bloomington, IN: Indiana University. Available online: http://www.poynter.indiana.edu/mr.pdf [site visited 01/09/04].

Berns, K. I., & Manning, F. J. (Eds.). (1996). Committee on Resource Sharing in Biomedical Research, Division of Health Sciences Policy, Institute of Medicine. Resource sharing in biomedical research. Washington, DC: National Academy Press. Available online: http://stills.nap.edu/html/biomed [site visited 01/09/04].

Bingham, C. (1998). Peer review on the Internet: A better class of conversation. The Lancet, 351, S110–S114.

Blumenthal, D., Campbell, E. G., Causino, N., & Louis, K. S. (1996). Participation of life-science faculty in research relationships with industry. New England Journal of Medicine, 335, 1734–1739.

Blumenthal, D., Causino, N., Campbell, E., & Louis, K. S. (1996). Relationships between academic institutions and industry in the life sciences—An industry survey. New England Journal of Medicine, 334(6), 368–373.

Burk, D. L. (1995). Research misconduct: Deviance, due process, and the disestablishment of science. George Mason Independent Law Review, 3, 305.

Callahan, E. S., & Dworkin, T.M. (2000). The state of state whistleblower protection. The American Business Law Journal, 38, 99.

Chalmers, I. (1999, Dec. 20). When silence is lethal. Chemistry and Industry, 961.

Chung, A. W. (2001). Resuscitating the constitutional “theory” of academic freedom: A search for a standard beyond Pickering and Connick. Stanford Law Review, 53, 915.

Chung, J. (2001). Does simultaneous research make an invention obvious? The 35 U.S.C. s 103 nonobvious requirement for patents as applied to the simultaneous research problem. Albany Law Journal of Science & Technology, 11, 337.

Committee on Science, Engineering, and Public Policy, National Academy of Sciences, National Academy of Engineering & Institute of Medicine. (1995). On being a scientist: Responsible conduct in research. Washington, DC: National Academy Press.

Committee on Science, Engineering, and Public Policy, National Academy of Sciences, National Academy of Engineering & Institute of Medicine. (1997). Adviser, teacher, role model, friend: On being a mentor to students in science and engineering. Washington, DC: National Academy Press.

Court, C., & Dillner, L. (1994). Obstetrician suspended after research inquiry. BMJ, 309, 1459.

Cretsinger, C. E., & Menell, P. S. (2001). Annual review of law and technology: Foreword. Berkeley Technology Law Journal, 16, 1.

Culliton, B. J. (1990). Inside the Gallo probe. Science, 248, 1494–1498.

Dartmouth College. (1998). Sources: Their use and acknowledgment. Hanover, NH: Trustees of Dartmouth College. Available online: www.dartmouth.edu/~sources [site visited 01/09/04].

Dingell, J. D. (1993). Misconduct in medical research (Shattuck Lectures). New England Journal of Medicine, 328, 1610.

Dreyfuss, R. C. (2000). Collaborative research: Conflicts on authorship, ownership, and accountability. Vanderbilt Law Review, 53, 1162.

Eastwood, S., Derisk, P., Leash, E., & Ordway, S. (1996). Ethical issues in biomedical research: Perceptions and practices of postdoctoral research fellows responding to a survey. Science and Engineering Ethics, 2, 89–114.

Ferguson, J. R. (1997). Biomedical research and insider trading (Sounding Board). New England Journal of Medicine, 337, 631–634.

Fotion, N., & Conrad, C. C. (1984). Authorship and other credits. Annals of Internal Medicine, 100, 592–594.

Friedberg, M., Saffran, B., Stinson, T. J., Nelson, W., & Bennett, C. L. (1999). Evaluation of conflict of interest in economic analyses of new drugs used in oncology. JAMA, 282, 1453.

Garry, R. F. (1999). BMJ’s editors should publish their own conflicts of interest regularly (Letter). BMJ, 318, 464.

Goldner, J. A. (1998). The unending saga of legal controls over scientific misconduct: A clash of cultures needing resolution. American Journal of Law & Medicine, 24, 293–343.

Goodman, N. W. (1994). Survey of fulfillment of criteria for authorship in published medical research. BMJ, 309, 1482.

Grodin, M. A. (Ed.). (1995). Meta medical ethics: The philosophical foundations of bioethics. Boston: Kluwer.

Hamilton, J. B., Greco, A. J., & Tanner, J. R. (1997). Ethical questions regarding joint authorship: Business and nonbusiness faculty perceptions on noncontributing authorship. Journal of Education for Business, 72(6), 325–330.

Heitman, E. (2000, July). Ethical values in the education of biomedical researchers. The Hastings Center Report, 30, S40.

Hixson, J. (1976). The patchwork mouse. Garden City: Anchor Press/Doubleday.

Horner, J. (2003). Morality, ethics, and law: Introductory concepts. In Ethical, moral and legal issues in speech and language pathology. In N. Helm-Estabrooks & N. Bernstein Ratner (Eds.) & J. Horner (Guest Ed.), Seminars in Speech and Language, 24(4), 263–274.

Horton, R. (1999). Scientific misconduct: Exaggerated fear but still real and requiring a proportionate response. The Lancet, 354, 7.

Horton, R., & Smith, R. (1996). Time to redefine authorship (Editorial). BMJ, 312, 723.

Howard, E. (1994). Science misconduct and due process: A case of process due. Hastings Law Journal, 45, 309.

Huston, P., & Moher, D. (1996). Redundancy, disaggregation, and the integrity of medical research. The Lancet, 347, 1024–1026.

Huth, E. J. (1986). Irresponsible authorship and wasteful publication. Annals of Internal Medicine, 104, 257–259.

Ingham, J. C. (2003). Research ethics 101: The responsible conduct of research. In Ethical, moral and legal issues in speech and language pathology. In N. Helm-Estabrooks & N. Bernstein Ratner (Eds.) & J. Horner (Guest Ed.), Seminars in Speech and Language, 24(4), 323–337.

International Committee of Medical Journal Editors. (1993). Uniform requirements for manuscripts submitted to biomedical journals. JAMA, 269, 2282–2286.

Kalichman, M. W. The online resource for instruction in the responsible conduct of research. Available online: http://ethics.ucsd.edu/resources/resources-training.html.

Kassirer, J. P. (1993). The frustrations of scientific misconduct. New England Journal of Medicine, 328, 1634–1636.

Kulynych, J. (1998). Intent to deceive: Mental state and scienter in the new uniform federal definition of scientific misconduct. Stanford Technology Law Review. Available online: http://stlr.stanford.edu/STLR/Articles/98_STLR_2/ [site visited 02/02/04].

Kuzma, S. M. (1992). Criminal liability for misconduct in scientific research. University of Michigan Journal of Law Reform, 25, 357.

LaFollette, M. C. (1992). Stealing into print: Fraud, plagiarism, and other misconduct in scientific publishing. Berkeley: University of California Press.

LaFollette, M. C. (1994). The pathology of research fraud: The history and politics of the U.S. experience. Journal of Internal Medicine, 235, 129–135.

Leach, R. E. (2001). Honesty—A fundamental trait. The American Journal of Sports Medicine, 29, 1.

Line, S. (1993). Scientific misconduct: A form of white coat crime. The Journal of Pharmacy & Law, 2, 15.

List, C. J. (1984, Fall). Scientific fraud: Social deviance or the failure of virtue? Science, Technology & Human Values, 10, 27–36.

Lundberg, G. D., & Flanagin, A. (1989). New requirements for authors: Signed statements of authorship responsibility and financial disclosure. JAMA, 262, 2003–2004.

McGarity, T. (1994). Peer review in awarding federal grants in the arts and sciences. High Technology Law Journal, 9, 1–92.

Mertelsmann, R. (2001). Haematologist may face disciplinary action for research fraud. BMJ, 322, 694.

National Bioethics Advisory Commission. (2001). Ethical and policy issues in research involving human participants. (Draft Report). Bethesda, MD, December 19, 2000. Final recommendations, May 18, 2001. Available online: http://bioethics.gov/#final [site visited 01/09/04].

Nelkin, D. (1996). An uneasy relationship: The tensions between medicine and the media. The Lancet, 347, 1600–1603.

Office of Research Integrity, U.S. Department of Health and Human Services. ORI provides working definition of plagiarism. Available online: http://ori.dhhs.gov/policies/plagiarism.shtml.

Office of Research Integrity, U.S. Department of Health and Human Services. (1999). Guidelines for editors. ORI Newsletter, 7(3), 1–2.

Office of Research Integrity, U.S. Department of Health and Human Services. (1999). On the duty of faculty members to speak out on misconduct. ORI Newsletter, 7(3), 4.

O’Reilly, J. T. (2000). Elders, surgeons, regulators, jurors: Are medical experimentation’s mistakes too easily buried? Loyola University Chicago Law Journal, 31, 317.

Panel on Scientific Responsibility and the Conduct of Research (Committee on Science, Engineering, and Public Policy [COSEPUP], sponsored by the National Academy of Sciences, National Academy of Engineering, and Institute of Medicine). (1992). Responsible science: Ensuring the integrity of the research process (Vol. I). Washington, DC: National Academy Press.

Pascal, C. B. (2000). Scientific misconduct and research integrity for the bench scientist. Society for Experimental Biology and Medicine, 224, 220–230.

Penslar, R. L. (Ed.). (1995). Research ethics: Cases & materials. Bloomington: Indiana University Press.

Pfeifer, M. P., & Snodgrass, G. L. (1990). The continued use of retracted, invalid scientific literature. JAMA, 263(10), 1420–1423.

Prentice, R. A. (1999). Clinical trial results, physicians, and insider trading. Journal of Legal Medicine, 20, 2.

Ramsay, S. (2001). UK consultant censured for failure to act on junior’s research fraud. The Lancet, 357, 780.

Relman, A. S. (1979). An open letter to the news media. New England Journal of Medicine, 300, 554–555.

Relman, A. S. (1983). Lessons from the Darsee affair. New England Journal of Medicine, 308, 1415–1417, cited by LaFollette, M. C. (1994). The pathology of research fraud: The history and politics of the U.S. experience. Journal of Internal Medicine, 235, 130.

Rennie, D., & Flanagin, A. (1994). Authorship! authorship! guests, ghosts, grafters, and the two-sided coin. JAMA, 271, 469–471.

Resnik, D. B. (1998). Ethics of science: An introduction. London: Routledge.

Riesenberg, D., & Lundberg, G. D. (1990). The order of authorship: Who’s on first? (Editorial). JAMA, 264(14), 1857, citing Lundberg, G. D. & Flanagin, A. (1989). New requirements for authors: Signed statements of authorship responsibility and financial disclosure. JAMA, 262, 2003–2004.

Rosenberg, S. A. (1996). Secrecy in medical research (Sounding Board). New England Journal of Medicine, 334(6), 392–394.

Shalala, D. (2000). Protecting research subjects—What must be done? (Sounding Board). New England Journal of Medicine, 343(11), 808–810.

Shamoo, A. E., & Resnik, D. B. (2003). Responsible conduct of research. New York: Oxford University Press.

Shapiro, D. W., Wenger, N. S., & Shapiro, M. F. (1994). The contribution of authors to multiauthored biomedical research papers. JAMA, 271, 438–442.

Sheikh, A. (2000). Publication ethics and the research assessment exercise: Reflections on the troubled question of authorship. Journal of Medical Ethics, 26(6), 422–427.

Smith, J. (1994). Gift authorship: A poisoned chalice? (Editorial). BMJ, 309, 1456–1457.

Smith, R. (1997). Authorship: Time for a paradigm shift? (Editorial). BMJ, 314, 992.

Smith, R. (1998). Beyond confidential: Transparency is the key. BMJ, 317, 291–292.

Spencer, F. (1990). Piltdown: A scientific forgery. New York: Oxford University Press.

Stern, J. E., & Elliott, D. (Eds.). (1997). Research ethics: A reader. Hanover, NH: University Press of New England.

U.S. Department of Health and Human Services. (1989). Responsibility of PHS awardee and applicant institutions for dealing with and reporting possible misconduct in science. Federal Register 54 (August 8), 32449, codified as 42 CFR Part 50 Subpart A.

Wessely, S. (1998). Peer review of grant applications: What do we know? The Lancet, 352, 301–305.

Wilmshurst, P. (1997). The code of silence. The Lancet, 349, 457.

Woodward, B. (1999). Challenges to human subject protections in US medical research. JAMA, 282, 1947.