> For a complete documentation index, fetch https://docs.voximplant.ai/llms.txt

# Example: Answering an incoming call

> This example answers an inbound Voximplant call and bridges audio to Ultravox’s WebSocket API for real‑time speech‑to‑speech conversations.

**⬇️ Jump to the [Full VoxEngine scenario](#full-voxengine-scenario).**

## Prerequisites

* Set up an inbound entrypoint for the caller:
  * Phone number: [https://voximplant.com/docs/getting-started/basic-concepts/phone-numbers](https://voximplant.com/docs/getting-started/basic-concepts/phone-numbers)
  * WhatsApp: [https://voximplant.com/docs/guides/integrations/whatsapp](https://voximplant.com/docs/guides/integrations/whatsapp)
  * SIP user / SIP registration: [https://voximplant.com/docs/guides/calls/sip](https://voximplant.com/docs/guides/calls/sip)
  * App user: [https://voximplant.com/docs/getting-started/basic-concepts/users](https://voximplant.com/docs/getting-started/basic-concepts/users) (see also [https://voximplant.com/docs/guides/calls/scenarios#how-to-call-a-voximplant-user](https://voximplant.com/docs/guides/calls/scenarios#how-to-call-a-voximplant-user))
* Create a routing rule that points the destination (phone number / WhatsApp / SIP username / app user alias) to this scenario: [https://voximplant.com/docs/getting-started/basic-concepts/routing-rules](https://voximplant.com/docs/getting-started/basic-concepts/routing-rules)
* Store your Ultravox API key in Voximplant [Secrets](/platform/voxengine/secrets) under `ULTRAVOX_API_KEY`.

## Session setup

The integration uses a WebSocket API client created via `Ultravox.createWebSocketAPIClient(...)`. The key inputs are:

* `endpoint`: use `Ultravox.HTTPEndpoint.CREATE_CALL` (or `CREATE_AGENT_CALL` for an existing Ultravox agent).
* `authorizations`: include `X-API-Key` with your Ultravox API key.
* `body`: provide `systemPrompt`, `model`, and `voice`.
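Taken together, the options can be sketched as a plain object (a hypothetical illustration: in VoxEngine the endpoint would be `Ultravox.HTTPEndpoint.CREATE_CALL` and the key would come from `VoxEngine.getSecretValue(...)`, as in the full scenario below):

```javascript
// Sketch of the options passed to Ultravox.createWebSocketAPIClient.
// The endpoint string and API-key value here are placeholders.
const clientOptions = {
  endpoint: "CREATE_CALL", // placeholder for Ultravox.HTTPEndpoint.CREATE_CALL
  authorizations: { "X-API-Key": "<ULTRAVOX_API_KEY from Secrets>" },
  body: {
    systemPrompt: "You are a helpful voice assistant.",
    model: "ultravox-v0.7",
    voice: "Mark",
  },
};
```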

## Connect call audio

Once the client is created, bridge audio both ways between the call and Ultravox:

```js title="Connect call audio"
VoxEngine.sendMediaBetween(call, voiceAIClient);
```

## Barge-in

Ultravox supports barge-in through the `PlaybackClearBuffer` event. To let a caller interrupt the agent's speech, listen for this event and clear the media buffer with `clearMediaBuffer()`:

```js title="Barge-in"
voiceAIClient.addEventListener(Ultravox.WebSocketAPIEvents.PlaybackClearBuffer, (event) => {
    Logger.write("===PLAYBACK_CLEAR_BUFFER===");
    voiceAIClient.clearMediaBuffer();
});
```

The Ultravox server can also send `PlaybackClearBuffer` on its own; the handler above covers server-initiated clears as well.

## Events

The client supports both WebSocket API events and media events. Key events used in the example:

* `Ultravox.WebSocketAPIEvents`: `Transcript`, `HTTPResponse`, `State`, `Debug`, `WebSocketError`, `ConnectorInformation`, `Unknown`, `PlaybackClearBuffer`
* `Ultravox.Events`: `WebSocketMediaStarted`, `WebSocketMediaEnded`
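For example, `Transcript` payloads may arrive either under `event.data.payload` or directly under `event.data`, with the text in `text` or `delta` (field names as used in the full scenario below). A small helper, sketched here as plain JavaScript with those field names taken as assumptions about the event shape, can normalize them:

```javascript
// Normalize a Transcript event into a "role: text" string, or null if
// the payload is incomplete. Field names mirror the Transcript handler
// in the full scenario; treat them as assumptions about the event shape.
function extractTranscript(event) {
  const payload = event?.data?.payload || event?.data || {};
  const text = payload.text || payload.delta;
  return payload.role && text ? `${payload.role}: ${text}` : null;
}
```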

## Full VoxEngine scenario

```javascript title="voxengine-ultravox-answer-incoming-call.js" maxLines={0}
/**
 * Voximplant + Ultravox WebSocket API connector demo
 * Scenario: answer an incoming call and bridge it to Ultravox.
 */

require(Modules.Ultravox);
const SYSTEM_PROMPT = `You are Voxi, a helpful voice assistant for phone callers. 
Keep responses short and telephony-friendly (usually 1-2 sentences).`;

// -------------------- Ultravox WebSocket API settings --------------------
const AGENT_CONFIG = {
    systemPrompt: SYSTEM_PROMPT,
    model: "ultravox-v0.7",
    voice: "Mark",
};

VoxEngine.addEventListener(AppEvents.CallAlerting, async ({call}) => {
    let voiceAIClient;

    // Terminate the session when the call ends; add cleanup and logging as needed
    call.addEventListener(CallEvents.Disconnected, () => VoxEngine.terminate());
    call.addEventListener(CallEvents.Failed, () => VoxEngine.terminate());

    try {
        call.answer();
        // call.record({ hd_audio: true, stereo: true }); // Optional: record the call

        // Create client and wire media
        voiceAIClient = await Ultravox.createWebSocketAPIClient(
            {
                endpoint: Ultravox.HTTPEndpoint.CREATE_CALL,
                authorizations: {"X-API-Key": VoxEngine.getSecretValue('ULTRAVOX_API_KEY')},
                body: AGENT_CONFIG,
                onWebSocketClose: (event) => {
                    Logger.write("===ULTRAVOX_WEBSOCKET_CLOSED===");
                    if (event) Logger.write(JSON.stringify(event));
                    VoxEngine.terminate();
                },
            },
        );
        VoxEngine.sendMediaBetween(call, voiceAIClient);

        // ---------------------- Event handlers -----------------------
        // Transcript: log conversation text from both sides
        voiceAIClient.addEventListener(Ultravox.WebSocketAPIEvents.Transcript, (event) => {
            const payload = event?.data?.payload || event?.data || {};
            const role = payload.role;
            const text = payload.text || payload.delta;

            if (role && text) Logger.write(`===TRANSCRIPT=== ${role}: ${text}`);
        });

        // Barge-in: clear queued agent audio when the server requests it
        voiceAIClient.addEventListener(Ultravox.WebSocketAPIEvents.PlaybackClearBuffer, (event) => {
            Logger.write("===PLAYBACK_CLEAR_BUFFER===");
            voiceAIClient.clearMediaBuffer();
        });

        // Consolidated "log-only" handlers - key Ultravox/VoxEngine debugging events
        [
            Ultravox.WebSocketAPIEvents.ConnectorInformation,
            Ultravox.WebSocketAPIEvents.HTTPResponse,
            Ultravox.WebSocketAPIEvents.State,
            Ultravox.WebSocketAPIEvents.Debug,
            Ultravox.WebSocketAPIEvents.WebSocketError,
            Ultravox.WebSocketAPIEvents.Unknown,
            Ultravox.Events.WebSocketMediaStarted,
            Ultravox.Events.WebSocketMediaEnded,
        ].forEach((eventName) => {
            voiceAIClient.addEventListener(eventName, (event) => {
                Logger.write(`===${event.name}===`);
                Logger.write(JSON.stringify(event));
            });
        });
    } catch (error) {
        Logger.write("===SOMETHING_WENT_WRONG===");
        Logger.write(String(error));
        voiceAIClient?.close();
        VoxEngine.terminate();
    }
});

```