AI_ASK
Overview
The AI_ASK function enables Excel users to generate text-based responses using advanced AI models, such as Mistral AI, directly from their spreadsheets. It accepts a prompt and optional tabular data and returns a context-aware response suitable for summarization, analysis, or business writing. The function uses the Mistral AI API, which provides state-of-the-art large language models for text generation and analysis. For more information, see the Mistral AI documentation and the official GitHub repository.
This example function is provided as-is without any representation of accuracy.
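Under the hood, AI_ASK issues a standard OpenAI-style chat-completions request to the endpoint named above. The snippet below is a minimal sketch of that request, separate from the full function shown later; the API key is a placeholder and the prompt is arbitrary.

import requests

# Minimal sketch of the chat-completions call that AI_ASK wraps.
# "YOUR_MISTRAL_API_KEY" is a placeholder value.
response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_MISTRAL_API_KEY"},
    json={
        "model": "mistral-small-latest",
        "messages": [{"role": "user", "content": "Summarize: sales rose 10% in Q4."}],
        "temperature": 0.5,
        "max_tokens": 250,
    },
)
print(response.json()["choices"][0]["message"]["content"])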
Usage
To use the function in Excel, enter it as a formula in a cell:
=AI_ASK(prompt, [data], [temperature], [max_tokens], [model], [api_key], [api_url])
prompt (str, required): The question, instruction, or task for the AI model.
data (list[list], optional, default=None): 2D list (Excel range) to provide additional context.
temperature (float, optional, default=0.5): Controls randomness/creativity (0.0 to 2.0).
max_tokens (float, optional, default=250): Maximum number of tokens to generate (5 to 5000).
model (str, optional, default="mistral-small-latest"): Model ID to use (e.g., "mistral-small-latest").
api_key (str, optional, default=None): API key for authentication. Get a free API key from Mistral AI.
api_url (str, optional, default="https://api.mistral.ai/v1/chat/completions"): OpenAI-compatible API endpoint URL.
The function returns a string containing the AI-generated response based on the prompt and optional data.
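Optional arguments are supplied positionally after the prompt. For example, the following formula uses illustrative values (A1:B5 is an arbitrary range and B7 is assumed to hold your API key) to request a longer, more deterministic response:
=AI_ASK("Draft a short summary of this data:", A1:B5, 0.2, 500, "mistral-small-latest", B7)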
Examples
Example 1: Employee Engagement Summary
Sample input data (Excel range A1:B5):
| Question | Score |
|---|---|
| Team collaboration | 4.5 |
| Workload | 3.2 |
| Career advancement | 3.0 |
| Management support | 4.0 |
=AI_ASK("Summarize the key findings from the employee engagement survey:", A1:B5)
Expected output:
“The survey indicates high satisfaction with team collaboration but highlights concerns about workload and career advancement opportunities.”
Example 2: Quarterly Sales Analysis
Sample input data (Excel range A1:E4):
| Region | Q1 | Q2 | Q3 | Q4 |
|---|---|---|---|---|
| North | 120 | 135 | 150 | 160 |
| South | 100 | 110 | 120 | 130 |
| Central | 90 | 95 | 100 | 105 |
=AI_ASK("Provide a brief analysis of the quarterly sales performance:", A1:E4)
Expected output:
“Sales increased steadily across all regions, with the North region showing the highest growth in Q4.”
Example 3: Summarize Incident Report
Sample input text (cell A1):
“On April 10th, a system outage affected order processing for 2 hours. The IT team resolved the issue by updating server configurations. No data loss occurred.”
=AI_ASK("Summarize the following incident report in one sentence:", A1)
Expected output:
“A brief system outage on April 10th was quickly resolved by IT with no data loss.”
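Behind the scenes, a range such as A1:B5 reaches the Python function below as a 2D list and is appended to the prompt as JSON. As a rough sketch (the exact cell types depend on how the add-in converts the range), the data argument for Example 1 would look like this:

survey_data = [
    ["Question", "Score"],
    ["Team collaboration", 4.5],
    ["Workload", 3.2],
    ["Career advancement", 3.0],
    ["Management support", 4.0],
]
# json.dumps(survey_data, indent=2) is appended to the prompt under "Data to analyze:"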
Python Code
import requests
import json

def ai_ask(prompt, data=None, temperature=0.5, max_tokens=250, model="mistral-small-latest", api_key=None, api_url="https://api.mistral.ai/v1/chat/completions"):
    """
    Generate a text response using an AI model based on a prompt and optional tabular data.

    This function sends a prompt and optional 2D list data to a Mistral AI-compatible API endpoint and returns the generated response as a string. It is designed for use as a custom Excel function for text analysis, summarization, or business writing.

    This example function is provided as-is without any representation of accuracy.

    Args:
        prompt (str): The question, instruction, or task for the AI model.
        data (list[list], optional): 2D list (Excel range) to provide additional context. Default is None.
        temperature (float, optional): Controls randomness/creativity (0.0 to 2.0). Default is 0.5.
        max_tokens (float, optional): Maximum number of tokens to generate (5 to 5000). Default is 250.
        model (str, optional): Model ID to use. Default is "mistral-small-latest".
        api_key (str, optional): API key for authentication. Default is None.
        api_url (str, optional): OpenAI-compatible API endpoint URL. Default is "https://api.mistral.ai/v1/chat/completions".

    Returns:
        str: The AI-generated response, or an error message if the request fails or parameters are invalid.
    """
    # Fall back to the demo token and endpoint if no API key was supplied
    if api_key is None or api_url is None:
        if "idToken" in globals():
            api_key = globals()["idToken"]
            api_url = "https://llm.boardflare.com"
        else:
            return "Login on the Functions tab for limited demo usage, or sign up for a free Mistral AI account at https://console.mistral.ai/ and add your own api_key."

    # Validate numeric parameters
    if not isinstance(temperature, (float, int)) or not (0 <= float(temperature) <= 2):
        return "Error: temperature must be a float between 0 and 2 (inclusive)"
    if not isinstance(max_tokens, (float, int)) or not (5 <= float(max_tokens) <= 5000):
        return "Error: max_tokens must be a number between 5 and 5000 (inclusive)"

    # Construct the message incorporating both prompt and data if provided
    message = prompt
    if data is not None:
        data_str = json.dumps(data, indent=2)
        message += f"\n\nData to analyze:\n{data_str}"

    # Prepare the API request payload
    payload = {
        "messages": [{"role": "user", "content": message}],
        "temperature": float(temperature),
        "model": model,
        "max_tokens": int(max_tokens)
    }
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
        "Accept": "application/json"
    }

    # Make the API request
    response = requests.post(api_url, headers=headers, json=payload)
    if response.status_code == 429:
        return "You have hit the rate limit for the API. Please try again later."
    try:
        response.raise_for_status()
        response_data = response.json()
        content = response_data["choices"][0]["message"]["content"]
        return content
    except Exception as e:
        return f"Error: {str(e)}"
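For testing outside Excel, ai_ask can also be called directly from Python. The following sketch reuses the Example 2 data as a 2D list; "YOUR_MISTRAL_API_KEY" is a placeholder for your own key:

sales_data = [
    ["Region", "Q1", "Q2", "Q3", "Q4"],
    ["North", 120, 135, 150, 160],
    ["South", 100, 110, 120, 130],
    ["Central", 90, 95, 100, 105],
]
# Prints the AI-generated analysis, or an error message if the request fails
print(ai_ask("Provide a brief analysis of the quarterly sales performance:",
             data=sales_data, api_key="YOUR_MISTRAL_API_KEY"))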