Frequently Asked Questions

Generative AI is not conscious or intelligent. It is a predictive text machine. This means that the outputs from AI may be inaccurate, off-topic, superficial, or illogical.

There are consequences of this:

  • Users need disciplinary knowledge to evaluate the outputs of AI platforms.
  • The AI output will be unsatisfactory for tasks requiring human judgment.
  • The AI output might not include accurate sources or references. Students will likely have to insert or check these manually. You can find out more about how students are advised to reference AI use accurately by browsing the Library’s generative AI guidance.

Here is a list of examples of AI use, published by OpenAI. These examples do not directly reference tasks such as essay writing or general text generation, which are well-known functions of large language models. If you are aware of students using generative AI for these purposes, it is important that you play your part by staying informed of the latest developments in the particular platforms you and they regularly use.

The user inputs a prompt into the model, and the model generates a response.

A good example is shown with Prompt part 1 on the website of educationalist Philippa Hardman. In this example, the user inputs a Role, then a Task, then an Instruction. The user can then enter further prompts to refine or extend the output.
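To make this structure concrete, here is a minimal sketch that sends a Role/Task/Instruction prompt to a chat model through the OpenAI Python library. It is purely illustrative: the prompt wording, the model name and the assumption that you have API access are ours, and in practice most staff will simply type a prompt like this into the Copilot or ChatGPT chat window.

```python
# Illustrative only: a Role / Task / Instruction prompt sent via the OpenAI
# Python library (v1.x). The prompt text and model name are example values.
from openai import OpenAI

client = OpenAI()  # expects an OPENAI_API_KEY environment variable

prompt = (
    "Role: You are an experienced university lecturer in physiology.\n"
    "Task: Draft three formative multiple-choice questions on cardiac output.\n"
    "Instruction: Pitch the questions at second-year undergraduate level and "
    "give the correct answer with a one-sentence explanation for each."
)

response = client.chat.completions.create(
    model="gpt-4o",  # substitute whichever model you have access to
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

A follow-up prompt (for example, “Now rewrite question 2 as a short-answer question”) can then be added to the same conversation to refine or extend the output, mirroring the iterative approach described above.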

As an Imperial staff member you already have access to a platform in which you can try out prompts – Microsoft Copilot. You can find out more by visiting our ICT guidance webpages.

There are many free short courses on the internet. You can browse a quick introduction to prompting on YouTube. The most practical advice is provided between 15:39 and 37:00.

See also: ‘How to Write an Effective Prompt in ChatGPT’ within the LinkedIn Learning resource ‘How to Research & Write Using Generative AI Tools’.

Imperial is also exploring whether we might develop our own short online courses on fundamental skills such as prompting, to support you and your students to use AI responsibly and effectively.

It is crucial to understand that all AI tools are continually evolving and may sound confident and persuasive even when presenting inaccurate information. This phenomenon has been well-documented and is known as AI hallucination.

If the topic is part of established academic understanding (e.g., physiology in Medicine), try using ChatGPT or Claude. You will need to fact-check what is stated. Both tools are best used as a sounding board for exploring new topics.

If the topic concerns current events, try using Perplexity. Because it cites its sources (often several per answer), it is easy to fact-check the information it provides. Tools like Perplexity can be used like a search engine. Note that citations are drawn from the wider internet and are not limited to scientific literature.

If the topic is in a research area, try using SciSpace or Scite. Both tools list papers relevant to the search query. In our testing, both tools often missed seminal papers. However, they can be a reasonable starting point.

SciSpace additionally summarises the retrieved papers from multiple perspectives, for example the methods used, limitations, and practical implications. Note that these summaries are often over-simplified to the point of limited value. Scite additionally has a widget that shows how many papers in the literature support or dispute a listed paper.

You can find out more about how students should be using AI as an information source by browsing the Library’s generative AI guidance.

Try uploading a PDF copy of the research article to Claude or SciSpace, and ask the questions you have about the article. You can start by asking the tool to summarise the article, then proceed to more specific questions about it.

You can find out more about using AI to summarise and notate articles by browsing the Library’s generative AI guidance. This guidance is largely published with a student audience in mind.

Image analysis capability by tool:

  • ChatGPT: No
  • ChatGPT Plus: Yes
  • Claude 3: Yes
  • Copilot: Yes
  • Gemini: Yes

Staff and students need to be aware of the potential issues of using sensitive or confidential data in AI systems. Imperial’s Library Services guidance states: “It is not advisable to add sensitive data (such as your name or other personal data) into generative AI tools, such as ChatGPT, as queries are stored and become part of the training data it draws upon.”

There are fewer concerns with using Microsoft’s Copilot, as it does not learn from the information you input, and it does not harvest information from Imperial’s systems.

Notably, Excel now also features natural language data analysis functionality (see this video of Excel AI), meaning that students have some AI data analysis options within software already provided by the university. This sits within the same Office 365 package as Copilot.

When used with the Noteable plugin, the paid ChatGPT Plus subscription (which provides GPT-4) can offer surprisingly detailed data analysis when provided with raw data, even with minimal prompting. (Some detailed examples can be viewed here.) Integration between different platforms and formats (Jupyter Notebooks etc.) is only likely to improve.

Copilot also now offers users free access to GPT-4, meaning that staff and students can access these improvements without a paid subscription. Early tests in November 2023 confirmed that graphs and images containing data and text can be analysed.

Copilot also allows users to generate detailed images using DALL-E 3.

Firstly, it is important to clarify that Imperial does not ban the use of generative AI. In fact, we encourage students to explore its potential and gain skills in this important technological field. We do, however, ask that students always acknowledge its use and that they are able to thoroughly demonstrate their own understanding of learning outcomes, independent of the use of any AI platforms.

It will sometimes be desirable or necessary for you as a member of teaching staff to encourage or discourage students from using AI tools to complete a practice or an assessed piece of work. Please ensure this guidance is clear and concise.

When students have been advised that AI is not to be used, or when they are told to use AI in a specific manner, they should not step outside of these parameters. If they do, staff should be ready to query whether students can demonstrate their understanding of learning outcomes.

This is a new and fast-moving area within education and society at large – it is to be expected that it will take some time for different academic disciplines to acclimatise to the opportunities and challenges of generative AI.

We recommend browsing the guides in the AI & Education Hub and familiarising yourself with some of the documentation intended for internal and external audiences in the Policies and Ethics section of the Hub.

No generative AI platform is perfect. This is a fast-moving field of technology, and privacy, copyright, and other important themes are changing from one day to the next.

Currently, our ICT team are reassured that Copilot has a resilient approach to data encryption, is already readily available across all the community’s devices, and has an accessible user interface.

Other free and paid-for platforms are available online and mentioned on these webpages. However, it is important to understand that we will not be able to offer you as much, or in many cases any, user support compared to products that are packaged within Office 365.

We seek to use these Hub webpages to introduce you to the main generative AI platforms available on the market, as well as their advantages and disadvantages.

It is highly relevant. Students should make all efforts to acknowledge their sources, as they would if they were not using AI.

Submitting work and assessments created by someone or something else, as if they were the student’s own, is plagiarism and a form of cheating. This includes AI-generated content. Please refer to the university’s Academic Misconduct Procedures for further information.

To ensure quality assurance is maintained, departments may choose to invite a random selection of students to an ‘authenticity interview’ on their submitted assessments. This means asking students to attend an oral examination on their submitted work to ensure its authenticity, by asking them about the subject or how they approached their assignment. It should be made clear to students that being invited to an authenticity interview does not mean that there is any specific concern that they have submitted work that is not their own.

At this time, we do not intend to deploy any additional AI detection functionality, due to concerns about the maturity of these products and their ability to accurately identify cases where students have used AI without the express permission of their teacher or outside the parameters agreed for their programme.
 
Our current approach, in line with many other universities in the UK, is to train staff to understand AI, identify its various uses, set parameters for those uses within students’ programmes, and be alert to the common features of AI-generated work. In turn, our students should also receive support from Imperial and proactively stay informed of the latest capabilities of AI platforms.

This approach is not prejudicial to Imperial deciding to review this decision in future, should we and the wider university sector have greater confidence in any technological solutions which may become available to detect the misuse of AI.

We value curiosity at Imperial and anticipate that many students will feel excited, and perhaps slightly uncertain, about the development of such fast-evolving technology. It is likely that you feel the same way as a staff member! It is important to understand that, for the foreseeable future, your peers and students may have varying levels of knowledge and experience in engaging with and discussing these platforms. This will change in time.

We therefore recommend that if a student wishes to have a simple, accessible introduction to generative AI, you direct them to our AI and Study Guidance Hub. This landing page features several similar resources to the staff hub webpage you are currently browsing, while being better suited to a student audience.

The Imperial community is welcome to contribute further questions and answers to this webpage by contacting the Education Office. We acknowledge and appreciate the significant contributions of the Faculty of Medicine AI Taskforce to the FAQs featured on this webpage.