Generative AI Tools or Chatbots:
Artificial Intelligence (AI) empowers machines, such as computers and robots, to execute tasks that traditionally require human cognition.[1] Within this domain, Machine Learning (ML) and Deep Learning (DL) are key subsets, with DL specifically leveraging multi-layered Artificial Neural Networks (ANNs) for complex data analysis. This architecture enables the modelling of intricate patterns and representations, thereby improving performance in fields such as computer vision and natural language processing.
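Purely as an illustration of the layered structure described above, the following minimal sketch shows a two-layer ANN forward pass; the layer sizes, weights, and input are hypothetical and serve only as an example, not as any tool's actual implementation.

```python
# Minimal sketch of a multi-layered Artificial Neural Network (ANN):
# two fully connected layers with a nonlinear activation, using NumPy only.
# Layer sizes and inputs are hypothetical and chosen purely for illustration.
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(1, 4))                      # one input sample with 4 features
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)    # hidden-layer weights and biases
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)    # output-layer weights and biases

h = np.maximum(0, x @ W1 + b1)                   # hidden representation (ReLU)
y = h @ W2 + b2                                  # output computed from the learned representation

print(y)
```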
A Chatbot is an AI-driven application that leverages a combination of predefined rules, Natural Language Processing (NLP), and Machine Learning (ML) algorithms to facilitate interactions and respond to user inquiries across various contexts.[2]
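As a simple illustration of the rule-based component of such systems, the sketch below maps recognised keywords to canned responses with a fallback reply; the rules and wording are hypothetical, and production chatbots additionally apply NLP and ML models.

```python
# Minimal sketch of a rule-based chatbot: predefined rules map recognised
# keywords to canned responses, with a fallback reply when nothing matches.
# The rules and wording here are hypothetical and purely illustrative.
RULES = {
    "submission": "Manuscripts can be submitted through the journal's online portal.",
    "review": "Peer review typically takes several weeks after submission.",
}

def reply(message: str) -> str:
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return "Sorry, I did not understand. Could you rephrase your question?"

print(reply("How long does peer review take?"))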
Generative Modelling is an advanced artificial intelligence approach that generates synthetic artefacts through a deep analysis of training datasets. By identifying and understanding the underlying patterns and statistical distributions within the data, generative models can create highly realistic replicas or samples that closely mimic the original data characteristics.[3]
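The core idea can be illustrated with a minimal sketch that estimates the statistical distribution of a hypothetical training dataset and then draws synthetic samples from it; real generative models (for example, GANs or diffusion models) learn far richer distributions, but the principle is the same.

```python
# Minimal sketch of generative modelling: estimate the statistical
# distribution of a (hypothetical) training dataset, then draw synthetic
# samples that mimic its characteristics.
import numpy as np

rng = np.random.default_rng(42)

training_data = rng.normal(loc=5.0, scale=2.0, size=1000)   # hypothetical dataset

mu, sigma = training_data.mean(), training_data.std()       # "learn" the distribution
synthetic = rng.normal(loc=mu, scale=sigma, size=1000)      # generate synthetic samples

print(f"original  mean={training_data.mean():.2f} std={training_data.std():.2f}")
print(f"synthetic mean={synthetic.mean():.2f} std={synthetic.std():.2f}")
```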
Generative AI (GAI) combines generative modelling with advances in Deep Learning (DL) to produce diverse content at scale, drawing on existing media such as text, graphics, audio, and video. The capabilities of GAI tools are extensive, spanning text generation, image synthesis, audio production, and the creation of synthetic data. Prominent examples include ChatGPT, Copilot, Gemini, Claude, NovelAI, Jasper AI, DALL-E, Midjourney, and Runway. Generative AI has substantial potential to enhance authors' creativity. Nonetheless, it is essential to acknowledge the inherent risks of the current generation of Generative AI tools or chatbots, including concerns about confidentiality and intellectual property, insufficient attribution practices, issues of accuracy and bias, and the potential for unintended applications.[4]
Generative AI tools or chatbots can significantly enhance the writing process by supporting content creation and narrative consistency. However, authors should refrain from using these tools for data analysis and insight extraction during research, as this may compromise the accuracy and validity of the results. Authors must specify the Generative AI tools or chatbots used in the Materials and Methods section of their articles and are responsible for upholding the integrity and quality of their work while adhering to ethical standards and publication guidelines.[4][5]
Authors bear the responsibility for maintaining the originality, validity, and integrity of their submissions to the journal. When utilizing generative AI Tools, authors must comply meticulously with the journal’s editorial and publishing policies regarding authorship, misconduct, and overall publishing guidelines.
Authors can leverage Generative AI tools or chatbots to enhance their writing workflows but should avoid using them for data analysis and insight generation during the research phase.
Authors using Generative AI tools or chatbots for writing, visual content, or data analysis must disclose the specific AI tools in the Materials and Methods section of their articles. They remain fully accountable for the integrity and content of their work, ensuring compliance with publication ethics and standards.[4][5]
- Generative AI tools or chatbots cannot be considered authors or co-authors.
- Authors should be transparent about their use of Generative AI tools or chatbots, detailing their application and disclosing the tools in the Materials and Methods section of the article.
- Authors must ensure the accuracy of content generated by Generative AI tools or chatbots, prevent plagiarism, and properly cite all sources used.
For Editors and Peer Reviewers:
Editors and reviewers serve as essential stewards of research quality and integrity, upholding the confidentiality of both submissions and the peer review process. Their oversight is crucial in mitigating the risks of entering submitted articles into Generative AI tools or chatbots, a practice that can result in privacy violations, infringement of intellectual property rights, and other serious concerns.
Editors and reviewers are expressly prohibited from inputting any files, images, or data of unpublished articles into generative AI tools or chatbots. This prohibition is not merely a policy directive but a vital safeguard for the intellectual property rights of the original authors. The significance of adhering to this guideline is paramount, underscoring the serious obligations that come with handling sensitive scholarly material.
Editors and reviewers are required to use only the specified and endorsed tools when evaluating articles. For additional guidance, please refer to the guidelines outlining the responsibilities of editors and reviewers.
References:
- Copeland BJ. Artificial intelligence. Encyclopaedia Britannica (last updated May 26, 2023). https://www.britannica.com/technology/artificial-intelligence. Accessed May 27, 2023.
- What is a chatbot? Oracle Cloud Infrastructure. https://www.oracle.com/chatbots/what-is-a-chatbot/. Accessed May 27, 2023.
- Gui J, Sun Z, Wen Y, Tao D, Ye J. A review on generative adversarial networks: Algorithms, theory, and applications. IEEE Trans Knowl Data Eng. 2023;35:3313-3332. DOI: 10.1109/TKDE.2021.3130191
- Zielinski C, Winker MA, Aggarwal R, Ferris LE, Heinemann M, Lapeña JF, Pai SA, Ing E, Citrome L, Alam M, Voight M, Habibzadeh F, for the WAME Board. Chatbots, Generative AI, and Scholarly Manuscripts. WAME Recommendations on Chatbots and Generative Artificial Intelligence in Relation to Scholarly Publications. WAME. May 31, 2023. https://wame.org/page3.php?id=106
- COPE Council. COPE position – Authorship and AI – English. https://doi.org/10.24318/cCVRZBms
- Adheres to the Editorial and Publishing Policies of Lattice Science and Publication (LSP)