AI in the Review Process
The DFG permits the use of artificial intelligence (AI) in the preparation of reviews. However, such use is only permitted subject to clearly defined conditions, which are set out in a guideline. Reviewers may use AI systems for support purposes at individual stages of their work, for example in structuring their own notes, editing language and screening publicly available literature. However, the academic analysis, evaluation and recommendation remain the responsibility of the reviewer.
What’s New
The DFG regards the AI systems listed below as currently compatible with the guideline (as of March 2026). The aim of this approved list is to provide guidance for reviewers. The list will be updated on an ongoing basis.
- Locally hosted AI systems, i.e. AI systems installed on the reviewer’s own device, where processing takes place exclusively on that device, without cloud computing or the transfer of processing results to the cloud (see the sketch after this list). Example: (externer Link)
- AI systems hosted by a trustworthy institution, e.g. a university or a public research institution. Example: (externer Link)
- Cloud-based (commercial) systems subject to a contractual licence agreement, i.e. AI systems for which a contractual licence agreement is in place guaranteeing data security, excluding use of the input data by the provider (e.g. for training purposes) and ensuring that these conditions are actively set by means of the appropriate security settings (e.g. “no logging”; “no storage”).
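To make the first type concrete, the following is a minimal sketch of a fully local language-editing step, assuming the open-source Ollama runtime and its Python client are installed on the reviewer’s own device; the model name and the note text are placeholders, and nothing is transmitted off the device.

```python
# Minimal sketch of the "locally hosted" type: a model that runs entirely
# on the reviewer's own device. Assumes the Ollama runtime and its Python
# client are installed and a model has been pulled locally ("llama3" is a
# hypothetical choice). All processing stays on the machine.
import ollama

# The input is the reviewer's OWN draft note, in line with the guideline's
# restriction to supporting tasks such as language editing.
own_note = "The methodology is sound, but the proposed timeline seem optimistic."

response = ollama.chat(
    model="llama3",
    messages=[
        {"role": "system",
         "content": "Improve grammar and clarity. Do not change the meaning."},
        {"role": "user", "content": own_note},
    ],
)
print(response["message"]["content"])
```

Because both the model and the inference run on the device, no input ever reaches a third party, which is what distinguishes this type from the cloud-based variant.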
Guideline
The DFG has adopted a guideline that sets out four principles governing the use of AI in the review process. The guideline is available here:
Principles
See here for a brief overview of the purposes for which AI systems may be used and the four principles set out in the guideline:
Human analysis and the expert assessment of the proposal are core elements of the review process. The use of AI systems in the preparation of reviews is therefore permitted only in a supporting capacity. All substantive judgements regarding the funding recommendation rest solely with the reviewer.
AI systems may be used only where care is taken to ensure that confidential content of funding proposals is not processed without oversight, stored, or used for purposes other than the conduct of the review process. The supportive use of AI in drafting a review must not result in the disclosure of confidential proposal content. This can be ensured by using the following types of system:
- Locally hosted systems without cloud computing
- Systems hosted by a trustworthy institution (e.g. a university or public research institution)
- Cloud-based (commercial) systems subject to a contractual licence agreement guaranteeing data security (see the sketch after this list)
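As a hedged illustration of the third type: some commercial APIs expose provider-side retention controls as per-request settings. The sketch below uses the `store` parameter of the OpenAI chat completions API as one such example; the model name and prompt are placeholders, and a request flag alone does not satisfy the guideline without the contractual licence agreement described above.

```python
# Sketch of the cloud-based type used under a contractual licence agreement.
# "store=False" asks the provider not to retain the completion; under the
# guideline this must additionally be backed by a licence agreement that
# guarantees data security and excludes use of the input for training.
from openai import OpenAI

client = OpenAI()  # assumes an institutionally licensed API key in the environment

response = client.chat.completions.create(
    model="gpt-4o",   # hypothetical model choice
    store=False,      # provider-side "no storage" setting
    messages=[
        {"role": "system",
         "content": "Improve grammar and clarity. Do not change the meaning."},
        {"role": "user",
         "content": "Reviewer's own draft note goes here."},
    ],
)
print(response.choices[0].message.content)
```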
If an AI system is used in preparing the review, this must be disclosed, specifying the purposes of use.
Uncritical adoption of AI-generated text is not permitted. AI-generated text must always be critically assessed, adapted and checked for factual accuracy, currency and potential bias.
Full responsibility for the content of the review lies with the person who prepared it. This applies even where parts of the text were generated with the assistance of AI systems. Responsible expert review requires subject expertise and ethical judgement, which by their very nature AI systems do not possess. Responsibility must therefore not be transferred to an AI system; it remains with the reviewer.
FAQ
The digital turn is also changing the way research is conducted, including its quality assurance. AI systems offer new possibilities for information processing and text support which, when used responsibly, can help ease workloads. For this reason, the DFG has decided to permit the use of AI systems in the review process, subject to clearly defined conditions. The underlying principles are confidentiality, transparency, quality assurance and responsibility. These are set out in detail in a binding guideline on the use of AI in the review process.
AI systems may only be used in a supporting capacity. This includes, for example:
- searching for relevant academic literature,
- linguistic or structural revision of one’s own draft texts,
- expanding one’s own bullet points or notes,
- formal checks (e.g. clarity, readability).
Human analysis and the expert assessment of the proposal are core elements of the review process. The use of AI systems in the preparation of reviews is therefore permitted only in a supporting capacity. All substantive judgements regarding the funding recommendation rest solely with the reviewer. Further details are set out in the (interner Link). For example, it is not permissible to have substantial parts of the review generated automatically and without reflection, i.e. without independent expert assessment.
The DFG sets out in detail the technical and legal requirements that must be met. These are considered to be met in particular when the AI system used is operated or licensed by the reviewer’s own higher education institution or research organisation.
When registering in elan, you confirm that if you use AI in the review process, you have sufficient AI literacy. The DFG relies on your self-assessment in this regard. There is no absolute standard here. Key elements of AI literacy include a basic technical understanding of how AI systems function, awareness of their limitations such as hallucinations and bias, knowledge of how AI systems can be used in the review process, and, closely related to this, a clear understanding of one’s own role in the epistemic evaluation process (see also the (interner Link)). For a useful summary of AI literacy in general, see the paper (externer Link) [AI literacy under Article 4 of the AI Act], issued by the Federal Network Agency. A wide range of free online training courses are available on the (externer Link) website operated by Stifterverband für die Deutsche Wissenschaft e.V.
Since the DFG has decided to obtain the required legal declarations centrally, you may be asked to provide (interner Link) that are not yet relevant to you or may never become relevant (for example if you have been asked to review a proposal, are based abroad and do not submit proposals to the DFG yourself). It is important to note that accepting the provision on usage rights does not in itself grant any usage rights. Such rights are granted only if and when a proposal is submitted. The legal declaration serves only to clarify this point. If, for example, you act “only” as a reviewer for the DFG, the accepted provision on usage rights has no effect on you as long as you do not submit a proposal to the DFG.
If the information transmitted to an AI system includes personal data, its use constitutes data processing within the meaning of the General Data Protection Regulation (GDPR). Since this processing is carried out on behalf of the controller, Article 28 GDPR requires the conclusion of a data processing agreement (DPA). Such agreements can and must be concluded with the respective provider as part of the licensing arrangement. Where you use an AI system licensed by your university or research institution, this is the responsibility of the institution. If you use an AI system hosted exclusively locally at your university or research institution, there is no requirement to conclude a data processing agreement (DPA), as in this case no processor is involved.
Explainer video: DFG Guideline on the Use of AI in the Review Process
White paper
The DFG has issued a (externer Link) summarising its internal process of clarification regarding the use of AI systems in the review process.
Contact
E-mail: KI@dfg.de