As AI tools become more integrated into the research process, transparency about how they are used is essential. Clear disclosure supports research integrity, builds trust, and ensures compliance with ethical, institutional, and publishing standards. Generative AI tools, in particular, can shape the tone, content, and structure of research outputs and may introduce bias or inaccuracies. Without proper disclosure, it can be difficult for reviewers, collaborators, or the public to understand how AI influenced the research.

When using AI tools in research or writing:

- Name the tool and its purpose (e.g. summarising literature, drafting, translation).
- Provide a basic audit trail (e.g. example prompts or a usage summary).
- Confirm human oversight (e.g. whether outputs were reviewed or edited).
- Check funder and publisher policies early in your workflow, as expectations vary.

Many publishers and funders provide specific guidance on using AI in research. Some require disclosure statements or audit trails, while others prohibit sharing confidential or unpublished work with AI tools. If no clear guidance is available, researchers should contact the relevant organisation directly for clarification.
Core rules for AI use in academic publishing and funding applications

AI cannot be an author
- No publisher allows AI to be listed as an author or co-author.
- Authorship requires human accountability and responsibility.
- Even if AI contributed significantly to the text, humans must take full responsibility for it.

Distinguish between assistive and generative AI use
- Assistive AI (grammar checking, language polishing) generally does not require disclosure.
- Generative AI (content creation, substantial editing) always requires disclosure.
- When in doubt, disclose AI use to maintain transparency.

Disclose generative AI use properly
- Include the statement in the methods or acknowledgements section (check journal preference).
- Specify the name and version of the AI tool used.
- Explain how and why it was used in your research process.
- Some journals require detailed documentation of prompts and outputs.

Protect confidentiality
- Peer reviewers must not input manuscripts they are reviewing into AI tools.
- Do not share confidential research data or funding applications with AI tools.
- Be aware that text entered into generative AI tools may be stored or used for AI training.

Follow strict image policies
- Most publishers prohibit using AI to create or manipulate research images and figures.
- Limited exceptions exist only when AI imaging is itself the research subject.
- Where permitted, AI-generated images must be clearly labelled.

Check terms of use
- Ensure AI tools do not claim ownership of your content.
- Verify that using the tool does not compromise your intellectual property rights.
- Consider the privacy implications of data entered into AI systems.

Always check specific guidelines
- Individual journals or funding agencies may have their own specific requirements.
- Some require submission of AI prompts and outputs as supplementary materials.

Different publishers may have different requirements for the disclosure statement.
Below is an example recommended by Elsevier:

"We ask authors who have used generative AI or AI-assisted tools to insert a statement at the end of their manuscript, immediately above the references or bibliography, entitled 'Declaration of generative AI and AI-assisted technologies in the writing process'. In that statement, we ask authors to specify the tool that was used and the reason for using the tool. We suggest that authors follow this format when preparing their statement:

During the preparation of this work the author(s) used [NAME TOOL / SERVICE] in order to [REASON]. After using this tool/service, the author(s) reviewed and edited the content as needed and take(s) full responsibility for the content of the publication."

This article was published on 2025-06-27.