AI in the Review Process
The DFG permits the use of artificial intelligence (AI) in the preparation of reviews. However, such use is only permitted subject to clearly defined conditions, which are set out in a guideline. Reviewers may use AI systems for support purposes at individual stages of their work, for example in structuring their own notes, editing language and screening publicly available literature. However, the academic analysis, evaluation and recommendation remain the responsibility of the reviewer.
- Guideline on the Use of AI in the Review Process
- FAQ
- White Paper: AI in the Review Process – DFG Position and Perspective
- Contact
What’s New
The DFG has published an issue of Information for Researchers containing the rules governing the use of AI in the review process.
On 22 April 2026, from 10:30 to 12:00, the DFG will host an online information event on the topic of “AI in the review process”. The login data will be provided here.
The DFG regards the AI systems listed below as being currently compatible with the guideline (as of March 2026). The aim of this approved list is to provide guidance for reviewers. The list will not be updated on an ongoing basis.
- Locally hosted AI systems, i.e. AI systems installed on the reviewer’s own device where processing takes place exclusively on that device, without cloud computing or the transfer of processing results to the cloud. Example: https://ollama.co
- AI systems hosted by a trustworthy institution, e.g. a university or a public research institution. Example: http://kiconnect.nrw
- Cloud-based (commercial) systems subject to a contractual licence agreement, i.e. AI systems for which a contractual licence agreement is in place guaranteeing data security, excluding use of the input data by the provider (e.g. for training purposes) and ensuring that these conditions are actively set by means of the appropriate security settings (e.g. “no logging”; “no storage”).
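For illustration only: a system of the first type keeps all processing on the reviewer’s own device. The sketch below assumes a local Ollama installation exposing its default HTTP endpoint on port 11434; the model name `llama3` and the helper functions are hypothetical choices, not part of the DFG guideline. No text sent this way leaves the local machine.

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server (assumption:
# Ollama is installed and a model has already been pulled locally).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    'stream': False requests a single JSON response rather than a
    token stream; processing happens entirely on the local device.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the locally hosted model and return its reply."""
    payload = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A supportive use in the sense of the guideline would be, for example, `ask_local_model("Improve the wording of these notes: ...")` — polishing the reviewer’s own notes, never delegating the substantive assessment itself.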
Guideline
The DFG has adopted a guideline which sets out four principles governing the use of AI in the review process.
Principles
The following is a brief overview of the purposes for which AI systems may be used and the four principles set out in the guideline:
Human analysis and the expert assessment of the proposal are core elements of the review process. The use of AI systems in the preparation of reviews is therefore permitted only in a supporting capacity. All substantive judgements regarding the funding recommendation rest solely with the reviewer.
AI systems may be used only where care is taken to ensure that confidential content of funding proposals is not processed without oversight, stored, or used for purposes other than the conduct of the review process. The supportive use of AI in drafting a review must not result in the disclosure of confidential proposal content. This can be ensured by using the following types of system:
- Locally hosted systems without cloud computing
- Systems hosted by a trustworthy institution (e.g. a university or public research institution)
- Cloud-based (commercial) systems subject to a contractual licence agreement guaranteeing data security
If an AI system is used in preparing the review, this must be disclosed, specifying the purposes of use.
Uncritical adoption of AI-generated text is not permitted. AI-generated text must always be critically assessed, adapted and checked for factual accuracy, currency and potential bias.
Full responsibility for the content of the review lies with the person who prepared it. This applies even where parts of the text were generated with the assistance of AI systems. Responsible expert review requires subject expertise and ethical judgement, which by their very nature AI systems do not possess. Responsibility must therefore not be transferred to an AI system; it remains with the reviewer.
FAQ
The digital turn is also changing the way research is conducted, including its quality assurance. AI systems offer new possibilities for information processing and text support which, when used responsibly, can help ease workloads. For this reason, the DFG has decided to permit the use of AI systems in the review process – subject to clearly defined conditions. The underlying principles are confidentiality, transparency, quality assurance and responsibility. These are set out in detail in a binding guideline on the use of AI in the review process.
AI systems may only be used in a supporting capacity. This includes, for example:
- searching for relevant academic literature,
- linguistic or structural revision of one’s own draft texts,
- expanding one’s own bullet points or notes,
- formal checks (e.g. clarity, readability).
Human analysis and the expert assessment of the proposal are core elements of the review process. The use of AI systems in the preparation of reviews is therefore permitted only in a supporting capacity. All substantive judgements regarding the funding recommendation rest solely with the reviewer. Further details are set out in the Guideline on the Use of AI in the Review Process. For example, it is not permissible to have substantial parts of the review generated automatically and without reflection, i.e. without independent expert assessment.
When registering in elan, you confirm that if you use AI in the review process, you have sufficient AI literacy. The DFG relies on your self-assessment in this regard. There is no absolute standard here. Key elements of AI literacy include a basic technical understanding of how AI systems function, awareness of their limitations such as hallucinations and bias, knowledge of how AI systems can be used in the review process, and, closely related to this, a clear understanding of one’s own role in the epistemic evaluation process (see also the Guideline on the Use of AI in the Review Process). For a useful summary of AI literacy in general, see the paper Hinweispapier KI-Kompetenzen nach Artikel 4 KI-Verordnung [AI literacy under Article 4 of the AI Regulation], issued by the Federal Network Agency. A wide range of free online training courses are available on the KI Campus website operated by Stifterverband für die Deutsche Wissenschaft e.V.
If the information transmitted to an AI system includes personal data, its use constitutes data processing within the meaning of the General Data Protection Regulation (GDPR). Since this processing is carried out on behalf of the controller, Article 28 GDPR requires the conclusion of a data processing agreement (DPA). Such agreements can and must be concluded with the respective provider as part of the licensing arrangement. Where you use an AI system licensed by your university or research institution, this is the responsibility of the institution. If you use an AI system hosted exclusively locally at your university or research institution, there is no requirement to conclude a DPA, as in this case no processor is involved.
White paper
The DFG has issued a white paper summarising its internal deliberations regarding the use of AI systems in the review process.
Contact
E-mail: KI@dfg.de