Local GPT for Excel
This add-in uses an implementation of WebLLM and will be replaced with the official Prompt API for Edge when it becomes available.
Overview
Run unlimited, free, and private AI inference directly in Excel using a small language model (Gemma 2 2B) that runs locally on your computer. Your data never leaves Excel. While not as fast or capable as ChatGPT, it may be sufficient for simple tasks.
Features
| Feature | Description |
|---|---|
| 🆓 Free | Unlimited free use |
| 💻 Local | Processed on your computer |
| 🔒 Private | No data is shared outside Excel |
Requirements
Your computer and browser must support WebGPU with the 16-bit floating point shader feature.
Checking support:
You can check WebGPU support and the 16-bit floating point (shader-f16) feature availability from your browser's developer console:
// True only if a WebGPU adapter exists and exposes the shader-f16 feature
const adapter = await navigator.gpu?.requestAdapter();
const supportsF16 = adapter?.features.has('shader-f16');
Browser compatibility notes:
- Chrome/Edge on Windows: The result above applies to Excel's web runtime.
- Mac: Excel uses Safari, which has only experimental WebGPU support, so it may or may not work.
System Requirements
| Resource | Requirement |
|---|---|
| RAM | At least 8 GB |
| Memory in use | 2–3 GB while running |
| Storage (cached model) | ~1.5 GB |
Note: Text generation speed depends on your CPU and GPU; on slower hardware, slow output is expected.
Function
This add-in provides a single general-purpose function.
GPT
=BOARDFLARE.GPT(prompt, [options])
| Parameter | Description |
|---|---|
| prompt | Instructions for the model (e.g., "summarize: " & A1) |
| options | Optional two-column (n×2) array with property names in column 1 and values in column 2 |
Available options:
| Property | Description |
|---|---|
| system_message | System prompt to guide model behavior |
| max_tokens | Maximum output length |
| temperature | Randomness (0 = deterministic, higher = more creative) |
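As a sketch of how this two-column structure maps onto a worksheet (the cell references here are hypothetical), you could keep property names in G1:G3 and their values in H1:H3, then pass the range instead of an array constant, assuming the add-in accepts any n×2 array as the options description suggests:
=BOARDFLARE.GPT("summarize: " & A1, G1:H3)
Keeping options in cells makes them easy to adjust without editing every formula that uses them.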
Examples
See the demo workbook for working examples.
Basic Usage
Concatenate an instruction with a cell value:
=BOARDFLARE.GPT("What is the problem this user is having? Support Ticket: " & A1)
With Options
Include options as an array constant:
=BOARDFLARE.GPT("What is the problem this user is having? Support Ticket: " & A1,
{"system_message", "You are an expert summarizer"; "max_tokens", 100; "temperature", 0.5}
)
Using LAMBDA
Wrap in a LAMBDA function for reuse:
=LAMBDA(
ticket,
BOARDFLARE.GPT(
"What is the problem this user is having? Support Ticket: " & ticket,
{"system_message", "You are an expert summarizer"; "max_tokens", 100; "temperature", 0.5}
)
)
Name it GETPROBLEM so end-users can call =GETPROBLEM(A1). Named LAMBDA functions simplify usage and let you update the prompt once for the entire workbook.
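One way to register the name, assuming the standard Windows Name Manager workflow (Formulas → Name Manager → New); the definition below is just the earlier LAMBDA condensed onto one line:
Name:       GETPROBLEM
Refers to:  =LAMBDA(ticket, BOARDFLARE.GPT("What is the problem this user is having? Support Ticket: " & ticket, {"system_message", "You are an expert summarizer"; "max_tokens", 100; "temperature", 0.5}))
After saving, =GETPROBLEM(A1) works anywhere in the workbook.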
With Array Values
Use ARRAYTOTEXT to insert arrays into prompts. If C4:E4 contains tags:
=BOARDFLARE.GPT(
"Which of these tags: (" & ARRAYTOTEXT(C4:E4) & ") best matches this support ticket: " & C3,
{"system_message", "You are classifying text, only return the value of the tag that best matches the text and nothing more."; "max_tokens", 5}
)
The system_message guides the model to return only the tag—otherwise, you may get a paragraph explaining the choice.
When constructing prompts, build them in a separate cell first to catch any mistakes before using them in the function.
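For instance (cell references are hypothetical), build the prompt in B2 where you can read it, then feed that cell to the function:
B2: ="Which of these tags: (" & ARRAYTOTEXT(C4:E4) & ") best matches this support ticket: " & C3
B3: =BOARDFLARE.GPT(B2, {"system_message", "You are classifying text, only return the value of the tag that best matches the text and nothing more."; "max_tokens", 5})
Once B2 reads correctly, you can inline the concatenation again or keep it split for easier debugging.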
Applications
Use Local GPT for simple or less critical tasks that don’t require a powerful AI model:
- Summarizing short text
- Simple classification
- Basic text extraction
- Draft generation
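As a rough sketch of the text extraction case (the prompt wording, option values, and cell reference are illustrative, not taken from the add-in's documentation):
=BOARDFLARE.GPT("Extract the product name mentioned in this ticket and return only the name: " & A1, {"max_tokens", 10; "temperature", 0})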
Attribution
This add-in is built with these open-source projects:
| Project | License |
|---|---|
| WebLLM | Apache 2.0 |
| Transformers.js | Apache 2.0 |
