
Build a domain‑guarded sales assistant with LangGraph, OpenAI, Amazon Lex, and Amazon Connect

  • Ian Willoughby
  • Sep 22
  • 6 min read

Customers expect to reserve a storage unit as easily as ordering food. In this post, we show how to build a production‑ready self‑storage chatbot that works across voice (Amazon Connect + Amazon Lex V2) and web (static S3 site), orchestrated by a LangGraph agent backed by OpenAI. You’ll see the high‑level architecture, how the pieces fit together, and the business outcomes you can expect.

What you’ll build

  • A domain‑locked assistant that handles only storage tasks: locations, unit availability, promotions, and reservations.

  • A LangGraph agent with tool calling (today: fake data; later: your real APIs).

  • Multichannel entry points: voice via Amazon Connect + Lex, and web chat via a static S3 site.

  • Durable session state, basic guardrails, and pragmatic operational patterns (timeouts, CORS, reset, etc.).

Business outcomes

  • Higher conversion: reduce friction from search to reservation; guide callers and web visitors to the right unit, fast.

  • Lower cost to serve: deflect routine “what do you have in 30312?” queries and after‑hours reservations to automation.

  • Consistent offers: centralize promotions logic so customers hear the same deals across channels.

  • Data you can act on: capture structured intent, size, features, and move‑in dates for downstream CRM/BI.

Solution overview

At the core is a LangGraph agent that holds your prompt, tools, and policy. Channel adapters (Lambda code hooks) send user turns into the agent and return the response. You can begin with fake data and later swap each tool for a real service without changing the conversation flow.

  



Web uses the same bot via a lightweight Lex Web Proxy (Lambda Function URL) that your static S3 page calls with fetch(). The proxy relays requests to Lex and returns its messages with proper CORS and session handling.


Core components

1) LangGraph agent (Python)

  • Guardrails: A strict system prompt keeps the assistant in‑domain (storage only) and politely refuses out‑of‑scope queries.

  • Tools:

    • list_locations(city, zip_code)

    • check_availability(store_id, size, floor, climate, drive_up)

    • get_promotions(store_id)

    • reserve_unit(store_id, unit_id, customer_name, phone, email, move_in_date)

  • State: For Lex, the Lambda persists message history in DynamoDB keyed by session_id. For CLI demos, you can use an in‑memory checkpointer.

  • LLM: OpenAI chat model (e.g., gpt-4o-mini) with deterministic temperature for reliable tool calling.

Swap‑in later: Replace fake tool bodies with your inventory, pricing, and reservation APIs; keep the same signatures to avoid refactoring the conversation graph.

2) Lex + Lambda (voice & bot logic)

  • Lex bot (V2) with an alias that calls your Lambda code hook every turn.

  • The Lambda creates a LangGraph runtime and forwards user turns. It returns plain text messages for Lex to speak (or play via SSML/audio prompts if desired).

  • For long work (e.g., larger LLM chains), consider offloading the heavy lifting from Lex to an asynchronous Lambda in the Connect flow, so callers hear progress tones while you compute (see “Performance patterns” below).
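The plain‑text reply the code hook hands back follows the standard Lex V2 fulfillment shape, a dialog action of `Close` plus a `PlainText` message; a minimal builder:

```python
# Sketch: wrap the agent's reply in the Lex V2 code-hook response format.
def lex_close_response(event: dict, text: str) -> dict:
    """Return a Lex V2 'Close' response carrying one plain-text message."""
    intent = event["sessionState"]["intent"]
    intent["state"] = "Fulfilled"  # mark the intent done for this turn
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": intent,
        },
        "messages": [{"contentType": "PlainText", "content": text}],
    }
```

Connect reads `messages[0].content` and speaks it; swap the content type for SSML if you want richer voice prompts.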

3) Web chat (static S3 + Function URL)

  • A single‑page site (HTML/CSS/JS) served from Amazon S3.

  • A minimal Lex Web Proxy (Lambda) with a Function URL handles browser requests:

    • Adds CORS headers and sanitizes input (do not echo Lex sessionState back).

    • For demos, supports a client “reset” action to start a fresh session and call Lex DeleteSession.

    • Keeps secrets and AWS permissions server‑side (no Cognito required for a quick demo).

4) Data & configuration

  • DynamoDB table (on‑demand) for session histories in the Lex path.

  • Environment variables for keys and bot IDs (use AWS Secrets Manager in production).

  • Optional Promotions table or service to centralize offers across channels.


Deployment path (high level)

  1. Create the DynamoDB table: storage-bot-sessions with PK session_id (S).

  2. Package the LangGraph Lambda: use a container image or SAM build with container so platform‑specific dependencies (e.g., pydantic-core) match Lambda’s OS.

  3. Create the Lex bot (V2):

    • One intent (e.g., SelfStorageChat) + AMAZON.FallbackIntent.

    • Dialog & Fulfillment code hooks enabled.

    • Provide Initial response and Fallback messages (prevents “blank ElicitIntent” runtime errors).

    • Build → Publish → attach the alias to your Lambda code hook.

  4. (Voice) Connect: create a flow that uses Get customer input (Lex) and Play prompt. For long tasks, add the Async Lambda + Wait + Play prompt + Load Lambda result loop.

  5. (Web) S3 site + proxy: create the Lex Web Proxy (Lambda Function URL) with CORS, upload index.html to S3, and point FUNCTION_URL to the proxy.

  6. Observability: enable CloudWatch logs and structured logging; test with a few happy‑ and edge‑cases.


Operations & architecture choices

Guardrails and safety

  • The system prompt refuses non‑storage topics and instructs the agent to use tools for facts (no fabrication of locations/prices/promos).

  • Never collect payment data in chat; hand off to a secure checkout.

  • Validate dates, emails, and phone numbers before calling reserve_unit.
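A lightweight pre‑reservation check might look like this (the regexes are deliberately loose, illustrative patterns, not production‑grade phone/email validation):

```python
# Sketch: validate contact details and move-in date before reserve_unit.
import re
from datetime import date

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
PHONE_RE = re.compile(r"^\+?[0-9][0-9\-\s().]{6,18}$")

def validate_reservation(email: str, phone: str, move_in_date: str) -> list:
    """Return human-readable problems; an empty list means OK to reserve."""
    problems = []
    if not EMAIL_RE.match(email):
        problems.append("That email address doesn't look right.")
    if not PHONE_RE.match(phone):
        problems.append("That phone number doesn't look right.")
    try:
        when = date.fromisoformat(move_in_date)  # expects YYYY-MM-DD
        if when < date.today():
            problems.append("The move-in date is in the past.")
    except ValueError:
        problems.append("Please give the move-in date as YYYY-MM-DD.")
    return problems
```

Returning the problems as user‑facing sentences lets the agent read them straight back to the customer instead of failing silently.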

Performance patterns

  • Keep the Lex turn fast. If a turn involves long reasoning or external APIs, push the heavy work to:

    • Asynchronous Lambda in the Connect flow (so you can play a tone while waiting), or

    • The web proxy path with a lightweight “typing” indicator in the UI.

  • Use a smaller/faster model (e.g., gpt-4o-mini) and cap max_tokens.

  • Warm Lambdas with Provisioned Concurrency, and set memory ≥ 1024 MB for better CPU/network.

Resilience & UX details

  • CORS: choose one source of truth (Function URL config or Lambda headers), not both.

  • Lex ElicitIntent: always provide a message if a path waits for user input; otherwise the runtime errors.

  • Session handling: use a client‑side inactivity TTL (e.g., 10 minutes) to rotate sessionId and show “Session expired.” Optionally call Lex DeleteSession from the proxy.

  • VPC egress: if Lambdas are in private subnets, add NAT or interface endpoints for the services you call.

Security

  • Least privilege IAM:

    • Lambda: only lex:RecognizeText (and lex:DeleteSession if used) on your bot alias ARN; DynamoDB table CRUD; Secrets Manager read for API keys.

    • Connect: only the specific Lambda ARNs in the flow.

  • Secrets: store the OpenAI key in Secrets Manager; load at startup.

  • CORS: restrict to your domain(s) in production.

Cost

  • Connect minutes, Lex requests, Lambda compute + requests, DynamoDB on‑demand, OpenAI usage.

  • Keep prompts concise, cap tokens, and reuse sessions to control cost.


Example elements (illustrative)

System prompt (excerpt)

You are StorageBot, a domain‑restricted assistant for a self‑storage company. Your ONLY job is to help customers find, compare, and reserve storage units. Use tools for locations, availability, promotions, and reservations. Politely refuse anything else.

Tool signatures (stable contract)

from typing import Dict, List, Optional

from langchain_core.tools import tool

@tool
def list_locations(city: Optional[str] = None, zip_code: Optional[str] = None) -> List[Dict]: ...

@tool
def check_availability(store_id: str, size: Optional[str] = None,
                       floor: Optional[int] = None, climate: Optional[bool] = None,
                       drive_up: Optional[bool] = None) -> List[Dict]: ...

@tool
def get_promotions(store_id: str) -> List[Dict]: ...

@tool
def reserve_unit(store_id: str, unit_id: str, customer_name: str,
                 phone: str, email: str, move_in_date: str) -> Dict: ...

LangGraph wiring (sketch)

from langchain_openai import ChatOpenAI
from langgraph.graph import START, MessagesState as AgentState, StateGraph
from langgraph.prebuilt import ToolNode, tools_condition

# Deterministic, one tool call per turn, for predictable latency and reliable tool use.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0).bind_tools(TOOLS, parallel_tool_calls=False)

def assistant_node(state):
    return {"messages": [llm.invoke(state["messages"])]}

builder = StateGraph(AgentState)
builder.add_node("assistant", assistant_node)
builder.add_node("tools", ToolNode(TOOLS, handle_tool_errors=True))
builder.add_edge(START, "assistant")
builder.add_conditional_edges("assistant", tools_condition)  # route to tools or end the turn
builder.add_edge("tools", "assistant")
graph = builder.compile()

What good looks like (rollout checklist)

  •  Lex bot has welcome + fallback messages; code hooks enabled; alias attached to Lambda; built & published.

  •  LangGraph agent returns short, actionable answers; always uses tools for facts.

  •  DynamoDB persists conversation state; PII is minimized.

  •  Web proxy enforces CORS, strips unsafe sessionState fields, and supports reset.

  •  Connect flow optionally uses Async Lambda + Wait + Play prompt for long work (no dead air).

  •  Alarming on Lambda errors/latency and Lex error rates.

  •  A/B experiments (e.g., “promo first” vs “size first”), measured against conversion to reservation.

Extending the solution

  • Real inventory & pricing: swap tool internals for your APIs; add availability caching.

  • Payments: hand off to a PCI‑compliant page; return reservation confirmation to the chat.

  • CRM/analytics: emit events (“unit_viewed”, “promo_applied”, “reservation_created”) to EventBridge or Kinesis for BI.

  • Multilingual: add more Lex locales; route by caller ANI or web Accept‑Language.

  • Human handoff: add an agent transfer in Connect when the bot detects frustration or repeated failure.

Clean up

To avoid ongoing charges in a test environment, remove:

  • The Connect phone number and contact flow.

  • The Lex bot alias (and optionally the bot), DynamoDB table, and Lambdas.

  • Any S3 buckets and CloudFront distributions hosting the web assets.


Conclusion

With a LangGraph agent at the core and Amazon Lex + Amazon Connect as channel adapters, you can ship a domain‑guarded self‑storage assistant that scales from a CLI demo to a production, omnichannel experience. Start with fake data, swap in your APIs, and layer on the operational patterns here to deliver a responsive, brand‑safe assistant that converts searches into reservations—day and night.

 
 
 
