You can think of a prompt as a conversation starter or a guiding question you give the model to elicit a specific response. It is the input text you feed to the model, much like telling a friend what you want to chat about.
For example, if you want an LLM to summarize the text of a customer support ticket, you might ask it to “summarize the input text” or “summarize the main issues from the customer support ticket”.
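To make this concrete, here is a minimal sketch of how such a summarization prompt might be assembled before being sent to a model. The function name, the instruction wording, and the sample ticket text are all illustrative assumptions, not a specific library's API:

```python
def build_summary_prompt(ticket_text: str) -> str:
    # Combine the instruction from the text above with the ticket
    # contents; the result is the prompt string sent to the model.
    instruction = "Summarize the main issues from the customer support ticket below."
    return f"{instruction}\n\n{ticket_text}"

# Hypothetical ticket text for illustration only.
ticket = "My order arrived damaged, and I have not heard back from support in three days."
prompt = build_summary_prompt(ticket)
print(prompt)
```

Most chat-style APIs accept a string like this as the user message; the model then returns its summary as the response.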