
Artificial Intelligence (AI) tools are becoming increasingly embedded in our lives. Tools like ChatGPT, Gemini and Copilot are changing the way people work by saving them time and improving accessibility. As with most things, there are both positives and negatives to using these tools, and this short piece provides some guidance to groups who may be considering their use for future funding applications.
"What is the position on the use of AI from funders?”
With so many potential funders out there, each one's position will be different. At the time of writing, The National Lottery Community Fund does allow the use of AI tools to help write funding applications and will not reject an application just because AI was used.
AI tools can support you if English is not your first language or if you're new to writing funding applications. Many groups have reported that using AI has helped them write their applications faster and with less effort. Translation apps and tools have also helped to enable applications from groups who do not typically write in English or who have additional needs. This has played a big part in the increase in applications from a wider and more diverse range of groups.
Things to consider when using AI
AI can provide a useful starting point, but what it produces for you is often not as strong as it might appear.
AI-supported applications do not tell the unique story of your community and how you want to support it. Content that is too generic may disadvantage your application. Because AI learns from other users, it is likely to draw on popular buzzwords that may not be applicable to your group or the project you have in mind. It may also suggest you will be delivering services outside your group's core activities or values, so checking and editing what has been produced is essential before hitting the submit button on your application. Saving your application and meeting your local GAVO Community Development Officer, so they can be an extra set of eyes over it, will also always help.

Risks when using AI
Look out for inaccuracies
AI can sometimes generate incorrect or misleading information.
To ensure accuracy, use trusted sources for data and research.
Fact-check AI-sourced content and adapt it using your own expertise and experience of the local perspective.
Your data might not be private
AI tools, especially free ones, may store the data you input, which could compromise confidentiality. It is therefore best to use AI only to help with writing the supporting evidence or the project outline plan.
If you input your organisation’s data into AI tools, ensure that you do so in compliance with relevant legislation.
Where personal information is used, please refer to the Information Commissioner’s Office for guidance on AI and data protection.
AI has an environmental impact
AI tools require large amounts of energy and fresh water to power their data centres, which creates a significant environmental impact. Estimates suggest that ChatGPT consumes between 50 and 90 times more energy per query than a conventional search.
Use AI mindfully: only use it where it will clearly help you apply or significantly improve the quality of your application.
Remember, technology is evolving rapidly, so keeping up to speed with what is available and whether it complies with data protection rules in our country is essential.