Use of Generative Artificial Intelligence (AI) in our grant application process

Generative Artificial Intelligence (AI) tools offer potential benefits for research but also present challenges and risks. Wellcome aims to ensure the responsible conduct of research and is a signatory to the Funders' joint statement on the use of generative AI tools. We require the use of generative AI tools to be declared when applying for grant funding, with some exceptions as outlined in this policy.

Last updated

Definitions 

Generative AI is a type of artificial intelligence that can be used to create new content (for example, text, code, images, videos or music), referred to as output. Generative AI uses machine learning algorithms to build models trained on large data sets.

Hallucinations are outputs which may initially appear to be believable but are in fact highly inaccurate or fabricated, including, but not limited to, references that sound plausible but don’t exist.

What we expect from applicants and organisations 

Applicants for Wellcome funding can use generative AI tools when preparing a grant application but must tell us about any substantive use of generative AI on their application form. Examples of substantive use include:

  • assisting idea generation or development of hypotheses
  • interpreting, analysing or summarising data (including audio and image files) and text
  • comparing literature sources
  • generating or refining code
  • generating an abstract

Applicants are not required to disclose any minimal use of generative AI tools on their application form. Examples of minimal use include:

  • translating applications from other languages to English 
  • improving the standard of English
  • formatting or reducing word count

Applicants must not use generative AI tools:

  • for uses that would breach our standard research integrity expectations. Examples include:
    • fabricating, falsifying, manipulating, duplicating, altering original data, images or results 
    • plagiarism – using other people’s ideas, intellectual property or work without acknowledgement or permission
    • breaching data protection rules or intellectual property rights
    • misrepresenting or falsifying information, for example, qualifications gained
    • faking citations or references
  • to generate an entire application, or sections of an application, without human involvement
  • without disclosing their substantive use on our application form

We collect information on the use of generative AI in grant applications for monitoring purposes. Anything you tell us will not be used in the assessment process. Any concerns that generative AI has been misused will be investigated.

Applicants are responsible for the content of their applications. They must:

  • be honest, transparent and accountable about the use of generative AI tools in their application
  • protect confidential information and personal data by not inputting it into generative AI tools
  • ensure the outputs from generative AI tools used in their applications are valid (this includes removing any false, misleading or hallucinated information)
  • be aware of third-party intellectual property rights (where the rights to the material belong to someone else) when using AI tools in an application, and only use information from third-party sources with consent from the relevant copyright holder

Organisations must provide guidance to their staff and students on promoting responsible research practice, including how to use AI in line with principles of research integrity and our policy.

What we expect from our expert reviewers and committee members 

Expert reviewers and advisory committee members must not input our confidential grant applications into generative AI tools. Doing so breaches the confidentiality agreement they accept when taking on the role.

Reviewers may only use generative AI tools for side tasks if they do not input our confidential grant applications. Examples of side tasks include: 

  • checking if research has previously been carried out
  • sourcing existing knowledge, methods or techniques
  • language refinement

Reviewers must ensure the outputs from generative AI tools used are valid. This includes removing any false, misleading or hallucinated information.

Wellcome’s use of generative AI tools when managing grant applications 

We do not use generative AI tools to assess the quality of grant applications or to aid funding decisions.

Wellcome staff have access to a limited number of enterprise data protected (EDP) generative AI tools. These tools have been assessed for security and safety, and information from grant applications may be entered into them for processing purposes, for example, to assess scheme eligibility or remit.

Other resources