1️⃣ Why Prompt Builder Matters for Developers
| Business Need | How Prompt Builder Solves It |
|---|---|
| Dynamic content generation (e.g., email drafts, product descriptions) | LLM‑driven text generation directly from Salesforce data |
| Personalized UI (real‑time suggestions, chat‑style assistants) | Prompt templates can be called from LWC with a single Apex method |
| Rapid prototyping | No need to host external LLM services; everything lives inside the platform |
| Governance & Auditing | All prompt executions are logged in the Prompt Builder Execution object |
Bottom line: Prompt Builder gives developers a low‑code, secure, and auditable way to embed generative AI inside Lightning Web Components.
2️⃣ Architecture Overview
```
+-------------------+          +-------------------+          +-------------------+
|  Lightning Web    |   Apex   |  Prompt Builder   |   LLM    |  External LLM     |
|  Component (LWC)  | <------> |  Service (API)    | <------> |  (e.g., Gemini)   |
+-------------------+          +-------------------+          +-------------------+
```
Key Flow:
1️⃣ LWC calls Apex @InvocableMethod → 2️⃣ Apex invokes Prompt Builder → 3️⃣ Prompt Builder sends request to LLM → 4️⃣ LLM returns generated text → 5️⃣ Apex returns result to LWC → 6️⃣ UI updates.
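Before diving into the real implementation, the round trip can be sketched with stubs standing in for each hop (`mockLlm` and `mockApexRunPrompt` are illustrative names, not real APIs):

```javascript
// Stand-in for the external model: return a fixed sentiment string.
function mockLlm(promptText) {
  return 'Positive - Follow-up with a call';
}

// Stand-in for the Apex wrapper: render the merge field into the prompt
// body, then forward the rendered prompt to the LLM.
function mockApexRunPrompt(templateId, variables) {
  const rendered = `Message: "${variables.Contact.Message__c}"`;
  return mockLlm(rendered);
}

// The LWC calls the Apex method and renders the returned text.
const result = mockApexRunPrompt('0X9B0000000A1bX', {
  Contact: { Message__c: 'I love the new release!' }
});
```

The key takeaway is the data shape at each hop: the component only ever passes a plain variable map and receives back a plain string.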
3️⃣ Step‑by‑Step Implementation
3.1 Create a Prompt Builder Template
- Navigate to Setup → Prompt Builder → Templates → New.
- Name: `ContactSentimentPrompt`
- Prompt Body (use `{{ }}` for merge fields):

  ```
  You are an empathetic sales assistant. Analyze the following customer message and return a concise sentiment (Positive, Neutral, Negative) and a one-sentence recommendation.
  Message: "{{Contact.Message__c}}"
  ```

- Variables: `Contact.Message__c` – Text (required)
- Test the prompt with a sample message to verify the output format (e.g., `Positive – Follow-up with a call`).
- Save and note the Template Id (e.g., `0X9B0000000A1bX`).
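It helps to know what the wrapper will actually send over the wire. Here is a sketch of the JSON payload built from the merge-field values; the `{"variables": ...}` shape is this article's assumption about the run endpoint, so verify it against your org's API version:

```javascript
// Build the JSON body POSTed to .../templates/{id}/run.
// The {"variables": ...} envelope is assumed, not confirmed API shape.
function buildRunRequestBody(variables) {
  return JSON.stringify({ variables });
}

const body = buildRunRequestBody({
  Contact: { Message__c: 'The demo went great, thanks!' }
});
// body → '{"variables":{"Contact":{"Message__c":"The demo went great, thanks!"}}}'
```

The nesting mirrors the merge-field path in the template: `{{Contact.Message__c}}` resolves against `variables.Contact.Message__c`.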
3.2 Apex Wrapper Class
Create a reusable Apex class that calls the Prompt Builder API.
```apex
public with sharing class PromptBuilderService {

    // Replace with your org's Prompt Builder endpoint
    private static final String ENDPOINT = URL.getSalesforceBaseUrl().toExternalForm()
        + '/services/data/v57.0/promptBuilder/templates/';

    @AuraEnabled
    public static String runPrompt(String templateId, Map<String, Object> variables) {
        // Build request body
        Map<String, Object> body = new Map<String, Object>{
            'variables' => variables
        };

        // Perform HTTP callout
        HttpRequest req = new HttpRequest();
        req.setEndpoint(ENDPOINT + templateId + '/run');
        req.setMethod('POST');
        // Note: session IDs issued in Lightning contexts are not always
        // API-enabled; for production use, prefer a Named Credential.
        req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(body));

        Http http = new Http();
        HttpResponse res = http.send(req);
        if (res.getStatusCode() != 200) {
            throw new AuraHandledException('Prompt Builder error: ' + res.getBody());
        }

        // Expected response: {"output":"Positive – Follow-up with a call"}
        Map<String, Object> result = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
        return (String) result.get('output');
    }
}
```

The method is deliberately *not* marked `cacheable=true`: the LLM can return different text for identical input, so caching would serve stale results.
Security Note: The class is `with sharing` and uses the current user's session ID, so the callout runs in the user's record-sharing context. Keep in mind that `with sharing` does not enforce CRUD/FLS by itself; check field-level access explicitly (for example, with `Security.stripInaccessible`) before sending field values to the prompt.
3.3 Lightning Web Component (LWC)
File Structure
```
lwc/
└─ sentimentAnalyzer/
   ├─ sentimentAnalyzer.html
   ├─ sentimentAnalyzer.js
   └─ sentimentAnalyzer.js-meta.xml
```
sentimentAnalyzer.html
```html
<template>
    <lightning-card title="Contact Sentiment Analyzer">
        <div class="slds-p-around_medium">
            <lightning-textarea
                label="Customer Message"
                value={message}
                onchange={handleMessageChange}
                placeholder="Paste the customer message here...">
            </lightning-textarea>

            <lightning-button
                variant="brand"
                label="Analyze Sentiment"
                onclick={analyze}
                class="slds-m-top_small"
                disabled={isAnalyzing}>
            </lightning-button>

            <template if:true={sentiment}>
                <p class="slds-m-top_medium">
                    <strong>Result:</strong> {sentiment}
                </p>
            </template>

            <template if:true={errorMsg}>
                <p class="slds-text-color_error">{errorMsg}</p>
            </template>
        </div>
    </lightning-card>
</template>
```
sentimentAnalyzer.js
```javascript
import { LightningElement, track } from 'lwc';
import runPrompt from '@salesforce/apex/PromptBuilderService.runPrompt';

export default class SentimentAnalyzer extends LightningElement {
    @track message = '';
    @track sentiment = '';
    @track errorMsg = '';
    @track isAnalyzing = false;

    handleMessageChange(event) {
        this.message = event.target.value;
    }

    async analyze() {
        // Guard against empty input so we don't burn an LLM call on nothing.
        if (!this.message) {
            this.errorMsg = 'Please enter a message first.';
            return;
        }
        this.isAnalyzing = true;
        this.sentiment = '';
        this.errorMsg = '';
        try {
            const variables = {
                Contact: {
                    Message__c: this.message
                }
            };
            // Template Id from the Prompt Builder template you created
            const templateId = '0X9B0000000A1bX';
            const result = await runPrompt({ templateId, variables });
            this.sentiment = result;
        } catch (err) {
            this.errorMsg = err.body?.message || err.message || 'Unexpected error';
        } finally {
            this.isAnalyzing = false;
        }
    }
}
```
sentimentAnalyzer.js-meta.xml
```xml
<?xml version="1.0" encoding="UTF-8"?>
<LightningComponentBundle xmlns="http://soap.sforce.com/2006/04/metadata">
    <apiVersion>57.0</apiVersion>
    <isExposed>true</isExposed>
    <targets>
        <target>lightning__RecordPage</target>
        <target>lightning__AppPage</target>
        <target>lightning__HomePage</target>
    </targets>
</LightningComponentBundle>
```
3.4 Deploy & Test
- Deploy the Apex class and LWC via VS Code (`sf project deploy start`, or the legacy `sfdx force:source:push`).
- Add the component to a Record Page (e.g., Contact).
- Paste a sample customer message, click Analyze Sentiment, and verify the result appears.
Tip: Use the Prompt Builder Execution object (Setup → Prompt Builder → Executions) to audit each call and confirm that the correct user context is applied.
4️⃣ Best Practices & Gotchas
| Practice | Reason |
|---|---|
| Keep prompts under 500 tokens | Reduces latency and cost; LLMs have hard token limits. |
| Validate user input | Prevents prompt injection; strip HTML tags before sending text to Prompt Builder. |
| Cache frequent results | Store recent sentiment results (e.g., in a Custom Setting) to avoid duplicate calls. Use `@AuraEnabled(cacheable=true)` only when the prompt does not depend on mutable data, such as static templates. |
| Log execution | Enable Prompt Builder auditing to satisfy compliance requirements. |
| Graceful fallback | If the LLM service is unavailable, show a friendly message and optionally route to a manual review queue. |
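The input-validation practice can be made concrete with a small sketch. This is a minimal illustration, not a complete XSS or prompt-injection defense:

```javascript
// Minimal sanitizer sketch: drop HTML tags and collapse whitespace before
// the message is merged into the prompt. Note that tag *contents* survive
// (e.g., inline script text), so this is not a security boundary on its own.
function sanitizeMessage(raw) {
  return raw
    .replace(/<[^>]*>/g, ' ') // strip HTML tags
    .replace(/\s+/g, ' ')     // collapse whitespace runs
    .trim();
}
```

Call it in `handleMessageChange` (or just before the Apex call) so only cleaned text ever reaches the template.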
5️⃣ Security & Governance Checklist
- CRUD/FLS – Ensure the Apex class respects field-level security for `Contact.Message__c` (e.g., with `Security.stripInaccessible`).
- Data Privacy – Do not send PII (SSNs, credit-card numbers) to the LLM; mask or hash values before calling Prompt Builder.
- Audit Trail – Verify that each Prompt Builder execution is recorded in the Prompt Builder Execution object.
- Rate Limits – Monitor the Prompt Builder API Usage dashboard to avoid hitting the daily quota.
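For the data-privacy item, here is a hedged sketch of pattern-based masking. Regexes only illustrate the idea; real PII handling belongs in a proper data-loss-prevention layer:

```javascript
// Redact obvious SSN and credit-card patterns before text leaves the org.
// Pattern matching alone will miss many PII formats; treat this as a
// last-resort safety net, not a compliance control.
function maskPII(text) {
  return text
    .replace(/\b\d{3}-\d{2}-\d{4}\b/g, '[SSN]')    // e.g., 123-45-6789
    .replace(/\b(?:\d[ -]?){13,16}\b/g, '[CARD]'); // 13-16 digit card runs
}
```

Run the message through `maskPII` after sanitizing and before building the `variables` map.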
6️⃣ Extending the Pattern
| Use‑Case | How to Extend |
|---|---|
| Multi‑step conversation | Chain multiple Prompt Builder templates and store intermediate results in a Custom Metadata record. |
| Dynamic language selection | Add a language variable to the template and pass the user’s locale from LWC. |
| Image generation | Use a Prompt Builder template that calls an image‑generation model (e.g., DALL·E) and return a URL to display in the component. |
| Batch processing | Write a Scheduled Apex job that runs Prompt Builder on a list of records and writes results back to custom fields. |
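The first two rows above share a pattern worth sketching: each step's output and the user context feed the next template's variable map. The variable names here (`Previous_Response__c`, `Language__c`) are hypothetical, not real fields:

```javascript
// Hypothetical chaining helper: carry the previous template's output and
// the user's locale into the variables for the next template in the chain.
function buildNextVariables(prevOutput, locale, baseVars = {}) {
  return {
    ...baseVars,
    Previous_Response__c: prevOutput, // hypothetical merge field
    Language__c: locale               // hypothetical merge field
  };
}

const nextVars = buildNextVariables('Positive', 'fr', {
  Contact: { Message__c: 'Bonjour!' }
});
```

Each step then calls the same `runPrompt` wrapper with a different template Id and the enriched variable map.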
7️⃣ TL;DR – Quick Reference
1️⃣ Create Prompt Builder template → note Template Id.
2️⃣ Build an Apex wrapper (`runPrompt`) that POSTs to `/promptBuilder/templates/{id}/run`.
3️⃣ Expose the wrapper via `@AuraEnabled`.
4️⃣ Build LWC that:
• Captures user input.
• Calls Apex method with variables.
• Displays the generated text.
5️⃣ Deploy, test, and audit.
8️⃣ Final Thoughts
Prompt Builder is the bridge between declarative AI capabilities and the custom UI power of Lightning Web Components. By mastering this integration, you’ll be able to:
- Deliver real‑time, AI‑driven experiences without leaving the Salesforce ecosystem.
- Keep governance, security, and auditability under tight control.
- Accelerate prototyping—what used to take weeks of external API work can now be built in a few hours.
Start with the simple sentiment analyzer above, then expand to more sophisticated use‑cases like dynamic email drafting, knowledge‑base summarization, or auto‑generated code snippets. The possibilities are limited only by your imagination—and the token limits of the underlying LLM. Happy building!