An Integration Guide to TanStack DB & ElectricSQL: Building Real-Time Apps with Optimistic Updates
Abstract
This document is an in-depth technical guide providing a clear, reusable example of how to combine TanStack DB and ElectricSQL to build modern web applications with real-time data synchronization, secure optimistic updates, and transactional consistency. We will explore the core architecture and best practices through a Q&A application centered around a messages table.
This guide was written based on the following key dependency versions:
- `@electric-sql/client`: ^1.0.4
- `@electric-sql/experimental`: ^1.0.4
- `@electric-sql/react`: ^1.0.4
- `@tanstack/db`: ^0.4.1
- `@tanstack/react-db`: ^0.1.23
- `@tanstack/electric-db-collection`: ^0.1.25
Architecture Overview
Our Q&A application architecture is divided into a client and a server, communicating via API calls and a real-time data stream. The sections below walk through the core components and how they interact.
1. Foundation: The Shape Gatekeeper Security Model
To securely sync the database to the client in real-time, we cannot expose it directly. ElectricSQL uses the concept of “Shapes” to define subsets of data that a client can subscribe to. Our architecture adds a “Shape Gatekeeper” layer to validate and authorize these data subscription requests.
1.1. Dual JWT Authentication
Our authentication system employs a dual JWT model for flexible and secure data access control:
Auth Token: A standard JWT used to verify the user’s identity, issued by the main authentication service.
Shape Token: A short-lived, special-purpose JWT. Before subscribing to a specific data table (a Shape), the client must exchange its valid Auth Token for a Shape Token from the backend. The claims of this token contain the exact table and query conditions (e.g., WHERE conversation_id = 'conv-abc') it is allowed to access, thus enabling row-level security.
1.2. Step 1: Shape Token Endpoint (/shapes/auth-token)

```typescript
// File path: apps/api/src/routes/main.ts
// A Hono-based backend route implementation
mainApp.post(
  "/shapes/auth-token",
  jwtAuth, // Middleware to validate the user's Auth Token
  zValidator("json", z.object({ table: z.string(), params: z.any() })),
  async (c) => {
    const { table, params } = c.req.valid("json");
    const user = c.get("user"); // User info parsed from the Auth Token

    let whereClause: string | undefined;

    // Dynamically generate the SQL WHERE clause based on the requested table and user identity
    if (table === "messages") {
      const conversationId = params?.conversationId;
      if (!conversationId) {
        // Disallow access to any messages if conversationId is not provided
        whereClause = `conversation_id = ''`;
      } else {
        // Business logic to verify the user's ownership of `conversationId` belongs here,
        // e.g. query the database to confirm it belongs to the current `user.userId`.
        const hasPermission = await checkConversationPermission(user.userId, conversationId);
        if (!hasPermission) {
          throw new ForbiddenError("No permission to access messages of this conversation");
        }
        // Authorize access to all messages in this conversation
        whereClause = `conversation_id = '${conversationId}'`;
      }
    } else {
      throw new ForbiddenError(`Access to table not allowed: ${table}`);
    }

    // Encode the authorization info (table name and WHERE clause) into a new Shape Token
    const shapeToken = await JWTService.generateShapeToken({ table, where: whereClause });

    return c.json({ token: shapeToken });
  }
);
```
1.3. Step 2: Secure Shape Proxy Endpoint (/shape)
After obtaining a short-lived Shape Token, the client does not request the ElectricSQL service directly. Instead, it requests our custom /shape proxy endpoint. This endpoint’s responsibility is to validate the Shape Token and securely forward the request to the actual ElectricSQL service.
```typescript
mainApp.get(
  "/shape",
  shapeAuth, // Key middleware: validates the Shape Token
  async (c) => {
    // 1. Get authorization parameters from the JWT payload.
    // The `shapeAuth` middleware has already validated the JWT and placed its payload
    // in the request context. This is the source of truth, not the client's URL query params.
    const { table, where } = c.get("shapePayload");
    const electricServiceUrl = new URL(`${process.env.ELECTRIC_API_URL}/v1/shape`);

    // 2. Construct the final request using the validated parameters from the JWT.
    // This prevents the client from tampering with the request to access unauthorized tables or rows.
    electricServiceUrl.searchParams.set("table", table);
    if (where) {
      electricServiceUrl.searchParams.set("where", where);
    }

    // 3. Forward Electric's own protocol parameters from the client request so the
    // sync stream works (but never `table`/`where`, which come from the token).
    const incomingUrl = new URL(c.req.url);
    for (const param of ["offset", "handle", "live", "cursor"]) {
      const value = incomingUrl.searchParams.get(param);
      if (value !== null) electricServiceUrl.searchParams.set(param, value);
    }

    // 4. Proxy the well-formed, secure request to the ElectricSQL service.
    // The `proxy` function is from 'hono/proxy'.
    return proxy(electricServiceUrl.toString(), {
      ...c.req,
      headers: { ...c.req.headers },
    });
  }
);
```
1.4. Client-side Collection Definition
On the client, we create a createMessagesCollection factory function that encapsulates all the logic for obtaining a Shape Token and configuring ElectricSQL’s data synchronization.
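A minimal sketch of such a factory, using the `createCollection` and `electricCollectionOptions` APIs from the TanStack packages listed above. The `Message` type and the appendix helpers (`getApiBaseUrl`, `getAuthHeader`, `onErrorFunc`) are assumed to be in scope; exact option names may vary slightly between versions:

```typescript
import { createCollection } from "@tanstack/react-db";
import { electricCollectionOptions } from "@tanstack/electric-db-collection";
import type { Message } from "./schema"; // hypothetical shared row type

// Factory: one collection per conversation, synced through our /shape proxy.
export function createMessagesCollection(conversationId: string) {
  return createCollection(
    electricCollectionOptions<Message>({
      id: `messages-${conversationId}`,
      shapeOptions: {
        // Point at our gatekeeper proxy, NOT the raw Electric service
        url: `${getApiBaseUrl()}/shape`,
        params: { table: "messages" },
        headers: {
          // A function value is re-evaluated on each request, so a
          // refreshed Shape Token is picked up automatically
          Authorization: getAuthHeader,
        },
        onError: onErrorFunc,
      },
      getKey: (message) => message.id,
    })
  );
}
```

Because the URL targets the proxy endpoint, every initial sync and reconnection passes through the gatekeeper's validation; the client never talks to ElectricSQL directly.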
1.5. Deep Dive: The createShapeConfig Factory for Resilient Connections
The core value of createShapeConfig is its onError callback, which provides an elegant, automated connection recovery mechanism for handling expired Shape Tokens. The implementation details are provided in the appendix.
1.6. Appendix: Client-side Helper Function Implementations
To make this document self-contained, simplified versions of key helper functions are provided below.
```typescript
// In a real application, this URL might be dynamic (e.g., from .env or a Zustand store)
export function getApiBaseUrl(): string {
  return "https://your-api.example.com";
}

// In a real application, API call functions would use fetch or axios
export async function sendQuestionApi(params: any): Promise<{ txid: string }> {
  const response = await fetch(`${getApiBaseUrl()}/questions/send`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${getAuthToken()}`, // Function to get the user's Auth Token
    },
    body: JSON.stringify(params),
  });
  if (!response.ok) throw new Error("API request failed");
  return response.json();
}
```
```typescript
const getAuthHeader = async () => {
  // Check cache first
  const cachedToken = useShapeTokenStore.getState().getToken(cacheKey);
  if (cachedToken) return `Bearer ${cachedToken}`;

  // On cache miss, fetch a new token
  const newShapeToken = await fetchShapeToken(table, params);
  useShapeTokenStore.getState().setToken(cacheKey, newShapeToken);
  return `Bearer ${newShapeToken}`;
};

const onErrorFunc = async (error: Error) => {
  // Handle only 401 errors, which usually mean an expired token
  if (error instanceof FetchError && error.status === 401) {
    console.warn(`[Shape Auth] 401 Unauthorized. Invalidating token and retrying...`);
    // Key step: remove the expired token from the cache
    useShapeTokenStore.getState().removeToken(cacheKey);
    // Returning an empty object signals to ElectricSQL that this is a
    // recoverable error and the stream should retry automatically
    return {};
  }
  throw error;
};
```
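The `useShapeTokenStore` referenced above is assumed to be a small key-value token cache (in our app, a Zustand store). A dependency-free sketch of the same contract, for readers who are not using Zustand:

```typescript
// Minimal stand-in for the Zustand-based token store used above.
// Hypothetical sketch: a production store would also track token expiry.
type TokenStore = {
  getToken(key: string): string | undefined;
  setToken(key: string, token: string): void;
  removeToken(key: string): void;
};

export function createShapeTokenStore(): { getState(): TokenStore } {
  const tokens = new Map<string, string>();
  const state: TokenStore = {
    getToken: (key) => tokens.get(key),
    setToken: (key, token) => void tokens.set(key, token),
    removeToken: (key) => void tokens.delete(key),
  };
  // Mirror Zustand's `store.getState()` access pattern
  return { getState: () => state };
}

export const useShapeTokenStore = createShapeTokenStore();
```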
2. Optimistic Writes: createTransaction and the txid Mechanism

When a user submits a new question, we need to update the client UI, call the backend API, and ensure eventual data consistency. This is where TanStack DB's createTransaction and ElectricSQL's txid mechanism come into play.
2.1. Transactional Action (sendQuestionAction)
We encapsulate the operation of sending a new question into an atomic, asynchronous Action function.
```typescript
export async function sendQuestionAction(params: {
  conversationId: string;
  questionContent: string;
  questionId: string; // Pre-generated on the client
  answerId: string;   // Pre-generated on the client
}) {
  const { conversationId, questionContent, questionId, answerId } = params;
  const messagesCollection = createMessagesCollection(conversationId);

  const transaction = createTransaction({
    autoCommit: false,
    mutationFn: async () => {
      // 1. Call the backend API
      const response = await sendQuestionApi({
        conversationId,
        questionId,
        answerId,
        content: questionContent,
      });
      if (!response.txid) throw new Error("API did not return a transaction ID");

      // 2. Wait for ElectricSQL to sync this transaction back to the client.
      // This is the key to the seamless "optimistic -> real data" transition.
      // IMPORTANT: awaitTxId must be awaited, otherwise optimistic updates will be ignored.
      await messagesCollection.utils.awaitTxId(response.txid);
    },
  });

  // Optimistic update:
  // this part executes synchronously and immediately, making the UI feel instant.
  transaction.mutate(() => {
    const questionMessage = createQuestionPlaceholder({
      id: questionId,
      conversation_id: conversationId,
      content: questionContent,
    });
    const answerMessage = createAnswerPlaceholder({
      id: answerId,
      conversation_id: conversationId,
    });
    messagesCollection.insert([questionMessage, answerMessage]);
  });

  // Commit the transaction, which triggers the execution of `mutationFn`
  await transaction.commit();
}
```
2.1.1. Appendix: Placeholder Object Creation
The createQuestionPlaceholder and createAnswerPlaceholder functions used in the optimistic update are simple factories for creating objects that match the local database Message schema.
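A sketch of these factories. The `role`, `status`, and timestamp fields are illustrative assumptions, not the exact columns of the schema above:

```typescript
// Illustrative Message shape; the real type comes from Drizzle's $inferSelect.
type Message = {
  id: string;
  conversation_id: string;
  role: "question" | "answer";
  content: string;
  status: "pending" | "completed";
  created_at: Date;
};

export function createQuestionPlaceholder(params: {
  id: string;
  conversation_id: string;
  content: string;
}): Message {
  return {
    ...params,
    role: "question",
    status: "pending", // Overwritten once the server-synced row arrives
    created_at: new Date(),
  };
}

export function createAnswerPlaceholder(params: {
  id: string;
  conversation_id: string;
}): Message {
  return {
    ...params,
    role: "answer",
    content: "", // Rendered as "Generating answer..." in the UI
    status: "pending",
    created_at: new Date(),
  };
}
```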
2.2. Backend: Obtaining the txid

The txid is the bridge between the client's action and the backend's data synchronization. In our backend's message.service.ts, this ID is obtained via a transaction helper function that wraps all database write operations. Its core relies on a built-in PostgreSQL function, pg_current_xact_id().
```typescript
/**
 * A transaction wrapper that ensures all operations run within a transaction
 * and returns the transaction ID.
 */
private async withTransaction<T>(
  callback: (tx: DrizzleTransaction, txid: number) => Promise<T>,
): Promise<T> {
  // Use Drizzle's transaction method to start a database transaction
  return await db.transaction(async (tx) => {
    // Inside the transaction, first get the current transaction ID
    const txid = await this.getTxId(tx);
    // Then, execute the callback containing all database operations
    return await callback(tx, txid);
  });
}

/**
 * Gets the current transaction's ID.
 * Internal function used by various transaction wrappers.
 *
 * Uses pg_current_xact_id()::xid::text to get the 32-bit internal xid,
 * which matches the ID sent in PostgreSQL logical replication streams,
 * and is the ID we use for matching in Electric.
 */
private async getTxId(tx: DrizzleTransaction): Promise<number> {
  try {
    // The ::xid cast removes the epoch, giving the raw 32-bit value.
    // This matches the value PostgreSQL sends in logical replication streams.
    const result = await tx.execute(sql`SELECT pg_current_xact_id()::xid::text as txid`);
    const txid = (result.rows[0] as { txid: string })?.txid;

    if (txid === undefined) {
      throw new Error("Failed to get transaction ID from pg_current_xact_id");
    }

    return Number.parseInt(txid, 10);
  } catch (error) {
    // Fall back to the deprecated txid_current()
    console.warn("[Transaction] pg_current_xact_id() failed, falling back to txid_current():", error);
    try {
      const result = await tx.execute(sql`SELECT txid_current() as txid`);
      const txid = (result.rows[0] as { txid: bigint })?.txid;

      if (txid === undefined) {
        throw new Error("Failed to get transaction ID from txid_current");
      }

      return Number(txid);
    } catch (fallbackError) {
      console.error("[Transaction] Both txid methods failed:", fallbackError);
      throw new Error("Failed to get transaction ID from all available methods");
    }
  }
}

/**
 * Creates the question and answer placeholders in a single transaction.
 */
async createQuestionAndAnswerPlaceholder(params: /* ... */): Promise<{ txid: number }> {
  // Execute operations via withTransaction to ensure atomicity and get the txid
  return this.withTransaction(async (tx, txid) => {
    // 1. Insert the question message...
    await tx.insert(messages).values({ /* ... */ });
    // 2. Insert the answer placeholder...
    await tx.insert(messages).values({ /* ... */ });
    // 3. Return the txid obtained from the wrapper
    return { txid };
  });
}
```
The beauty of this design is that the withTransaction method encapsulates the entire flow: “start transaction, get ID, execute operations, commit transaction.” Any method needing to write to the database atomically and requiring a txid for ElectricSQL sync can simply use this helper.
2.3. How awaitTxId Works
The txid is the key that bridges the gap between the frontend’s optimistic update and the backend’s true data. The await collection.utils.awaitTxId(txid) function works as follows:
1. Listen to the replication stream: it registers a temporary listener with ElectricSQL's client-side runtime.
2. Match the transaction ID: ElectricSQL receives the data replication stream from PostgreSQL, which includes metadata for each transaction, including its txid.
3. Resolve the promise: when the client-side runtime receives a data change matching the txid it is waiting for, it knows that this specific transaction has been successfully synced from the server to the client's local database. At this point, the awaitTxId promise is resolved.
This mechanism elegantly guarantees that when the commit() function returns, our local database not only contains the optimistic placeholders but that these placeholders have already been overwritten by the persisted, true data from the server.
Critical Requirement: awaitTxId must be awaited. If you don't await it, the optimistic state is discarded before the server's data arrives, and the UI will flicker.
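Conceptually, the mechanism can be modeled as a registry of pending txids that is checked as replicated transactions arrive. The following is a simplified, dependency-free sketch of that idea, not ElectricSQL's actual implementation:

```typescript
// Simplified model of txid matching; Electric's real client also handles
// timeouts and stream resets.
export class TxIdMatcher {
  private seen = new Set<number>();
  private waiters = new Map<number, Array<() => void>>();

  // Called by the sync layer whenever a replicated transaction arrives
  onTransactionSynced(txid: number): void {
    this.seen.add(txid);
    const resolvers = this.waiters.get(txid) ?? [];
    this.waiters.delete(txid);
    for (const resolve of resolvers) resolve();
  }

  // The conceptual analogue of collection.utils.awaitTxId(txid)
  awaitTxId(txid: number): Promise<void> {
    // Resolve immediately if the transaction was already synced
    if (this.seen.has(txid)) return Promise.resolve();
    return new Promise((resolve) => {
      const list = this.waiters.get(txid) ?? [];
      list.push(resolve);
      this.waiters.set(txid, list);
    });
  }
}
```

Note the already-seen check: if the replicated transaction arrives before `awaitTxId` is called, the promise must still resolve rather than wait forever.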
3. The Payoff: Real-time UI with useLiveQuery
With a secure data channel and reliable data operations in place, we can now reap the benefits in the UI layer. TanStack DB’s useLiveQuery hook allows us to effortlessly subscribe to changes in a Collection. The latest version of TanStack DB now supports loading states, making it easier to handle initial data synchronization and empty states.
```tsx
function QuestionAnswerList({ conversationId }: { conversationId: string }) {
  // Subscribe to the message collection for a specific conversation
  const messagesCollection = useMemo(
    () => createMessagesCollection(conversationId),
    [conversationId],
  );

  // The new useLiveQuery syntax with destructured data and loading state
  const { data: messages, isLoading } = useLiveQuery((q) =>
    q.from({ messagesCollection })
      .orderBy(({ messagesCollection }) => messagesCollection.created_at, "desc"),
  );

  // Handle the loading state: during the initial shape sync there is no data yet
  if (isLoading) {
    return <div className="loading">Loading messages...</div>;
  }

  return (
    <ul>
      {messages.map((message) => (
        <li key={message.id}>{message.content}</li>
      ))}
    </ul>
  );
}
```
When a user submits a new question, sendQuestionAction’s transaction.mutate() immediately inserts the placeholders into the local database. useLiveQuery detects this change, and the QuestionAnswerList component re-renders instantly, showing the new question and “Generating answer…”, achieving a perfect optimistic update. When the backend’s answer generation task completes and updates the database, ElectricSQL syncs the change back to the client. useLiveQuery triggers another re-render, updating the answer placeholder with the real response.
4. End-to-End Flow
Let’s summarize the entire process with a sequence diagram:
```mermaid
sequenceDiagram
    participant User
    participant Component as React Component
    participant Action as sendQuestionAction
    participant LocalDB as TanStack/Electric DB
    participant API as Backend API
    participant Postgres
    participant Electric as ElectricSQL Replication

    User->>Component: Clicks "Send" button
    Component->>Action: sendQuestionAction({ content: 'Hello!' })
    Action->>LocalDB: Optimistically insert Question & Answer placeholders
    Note right of LocalDB: UI instantly shows question and "Generating answer..."
    Action->>API: POST /questions/send
    API->>Postgres: BEGIN TRANSACTION
    API->>Postgres: INSERT question message
    API->>Postgres: INSERT answer placeholder
    API->>Postgres: COMMIT
    Postgres-->>API: Returns txid
    API-->>Action: Returns { success: true, txid: '...' }
    Action->>LocalDB: await collection.utils.awaitTxId(txid)
    Note right of LocalDB: Action pauses, listening for replication
    Postgres-->>Electric: Replicates committed transaction
    Electric-->>LocalDB: Pushes changes to client
    LocalDB-->>Action: awaitTxId promise resolves
    par Background Answer Generation
        API->>API: Generate Answer
        API->>Postgres: UPDATE answer message SET content = '...'
    end
    Postgres-->>Electric: Replicates answer
    Electric-->>LocalDB: Pushes final reply to client
    Note right of LocalDB: UI automatically updates to show the final reply
```
With this architecture, we have successfully built a powerful real-time application in a declarative, fault-tolerant, and efficient manner. This pattern elegantly encapsulates the complexities of UI updates, data persistence, and real-time synchronization, allowing developers to focus on implementing business logic.
5. Troubleshooting
5.1. JSONB Fields and TypeScript Types
When working with PostgreSQL JSONB fields in ElectricSQL, you’ll encounter some important limitations and performance considerations:
Performance Impact
⚠️ WARNING: Avoid using JSONB fields for UI-critical data that is frequently accessed or rendered. JSONB fields can severely impact UI performance because:

- Each access requires parsing the JSON structure
- React re-renders are triggered for the entire JSONB object even when only a nested property changes
- Large JSONB structures can cause significant memory overhead

Best Practice: Use JSONB only for supplementary data that is:

- Rarely accessed in the UI
- Not part of the core rendering logic
- Used for metadata, configuration, or audit trails
TypeScript Type Issues
ElectricSQL’s client SDK converts all JSONB fields to the generic Json type, which loses all type information. There is currently no automatic way to preserve types, requiring manual type assertions.
```typescript
// Type inference from Drizzle
type Message = typeof messages.$inferSelect;
// Message.metadata type is: { author: ...; tags: ...; priority: ... } | null

// What ElectricSQL returns after synchronization
const selectMessageSchema = createSelectSchema(messages);
type ElectricMessage = z.infer<typeof selectMessageSchema>;
// ElectricMessage.metadata type is: Json | null (type information lost!)
```
Solution - Transform Functions with Type Assertions:
```typescript
// Custom parser for date handling and type restoration
const parser = {
  timestamp: (date: string) => new Date(date),
} as unknown as ShapeOptions["parser"];
```
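For the type-assertion half of the solution, a hedged sketch of a transform function follows. The field names in `MessageMetadata` are illustrative, and the assertion is a compile-time claim only; validate with a schema (e.g., Zod) if the payload is not fully trusted:

```typescript
// Json: the erased type ElectricSQL hands back after sync.
type Json = string | number | boolean | null | Json[] | { [key: string]: Json };
// MessageMetadata: the richer type Drizzle originally inferred (illustrative fields).
type MessageMetadata = { author: string; tags: string[]; priority: number };

type ElectricMessage = { id: string; metadata: Json | null };
type Message = { id: string; metadata: MessageMetadata | null };

// Restore the lost JSONB type with an explicit assertion at the sync boundary,
// so the rest of the app works with the precise type.
export function toMessage(row: ElectricMessage): Message {
  return {
    ...row,
    metadata: row.metadata as MessageMetadata | null,
  };
}
```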
5.2. Gatekeeper Tip: Authorizing Multiple Conversations at Once

Instead of issuing a Shape Token per conversation, the gatekeeper can authorize all conversations the user is a member of with a single WHERE clause:

```typescript
// Single WHERE clause with subquery handles everything
whereClause = `conversation_id IN (
  SELECT c.id
  FROM conversations c
  JOIN conversation_members cm ON c.id = cm.conversation_id
  WHERE cm.user_id = '${user.userId}'
    AND cm.role IN ('owner', 'member')
)`;
```