> For a complete documentation index, fetch https://docs.voximplant.ai/llms.txt

# Example: Using Vertex AI

> This example answers an inbound Voximplant call and connects Gemini Live API through Vertex AI.


The scenario answers the inbound call and connects the Gemini Live API through Vertex AI.
It keeps the Vertex service account credentials in a dedicated scenario, so you can isolate them and reuse them across routing rules.

**⬇️ Jump to the [Full VoxEngine scenario](#full-voxengine-scenario).**

## Prerequisites

* Set up an inbound entrypoint for the caller:
  * Phone number: [https://voximplant.com/docs/getting-started/basic-concepts/phone-numbers](https://voximplant.com/docs/getting-started/basic-concepts/phone-numbers)
  * WhatsApp: [https://voximplant.com/docs/guides/integrations/whatsapp](https://voximplant.com/docs/guides/integrations/whatsapp)
  * SIP user / SIP registration: [https://voximplant.com/docs/guides/calls/sip](https://voximplant.com/docs/guides/calls/sip)
  * App user: [https://voximplant.com/docs/getting-started/basic-concepts/users](https://voximplant.com/docs/getting-started/basic-concepts/users) (see also [https://voximplant.com/docs/guides/calls/scenarios#how-to-call-a-voximplant-user](https://voximplant.com/docs/guides/calls/scenarios#how-to-call-a-voximplant-user))
* Create a routing rule that points the destination (phone number / WhatsApp / SIP username / app user alias) to this scenario: [https://voximplant.com/docs/getting-started/basic-concepts/routing-rules](https://voximplant.com/docs/getting-started/basic-concepts/routing-rules)
* Store your Vertex parameters in Voximplant `ApplicationStorage`:
  * `GCP_PROJECT_ID`
  * `GCP_REGION`

For full Vertex setup details, see: [https://voximplant.com/docs/voice-ai/google/vertex](https://voximplant.com/docs/voice-ai/google/vertex)
To create a service account key (JSON), see: [https://cloud.google.com/iam/docs/keys-create-delete](https://cloud.google.com/iam/docs/keys-create-delete)
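If you use the gcloud CLI, you can create the key file in one command. The account email and file name below are placeholders; substitute your own service account:

```shell
# Create a JSON key for an existing service account (names are placeholders)
gcloud iam service-accounts keys create vertex-key.json \
  --iam-account=vertex-express@your-gcp-project-id.iam.gserviceaccount.com
```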

## Credentials scenario (separate route entry)

Store the Vertex service account JSON in a **separate scenario** and include it **before** the main scenario in your routing rule.

Routing rule scenario order:

```
gemini-vertex-credentials → gemini-using-vertex-ai
```

<Info title="Why a separate scenario?">
  Vertex credentials JSON is typically too large for ApplicationStorage. VoxEngine shares global scope across scenarios in the same routing rule, so a credentials scenario can set a global variable for the main scenario to read.
</Info>
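The shared-scope behavior can be pictured as concatenating the scenario sources into one execution context. The plain Node.js sketch below only simulates that idea; it is not how VoxEngine is implemented, and the scenario strings are illustrative:

```javascript
// Plain Node.js simulation of VoxEngine's shared scenario scope.
// Each string stands in for one scenario in the routing rule.
const credentialsScenario =
  'var GEMINI_VERTEX_CREDENTIALS = { project_id: "demo-project" };';
const mainScenario = 'GEMINI_VERTEX_CREDENTIALS.project_id;';

// Evaluating both sources in a single scope lets the second scenario
// read the global variable declared by the first.
const projectId = eval(credentialsScenario + "\n" + mainScenario);
console.log(projectId); // "demo-project"
```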

Example credentials scenario:

```javascript title={"voxengine-gemini-vertex-credentials.js"} maxLines={0}
/**
 * Voximplant + Gemini Live API connector demo
 * Scenario: load Vertex AI credentials for Gemini Live API (example only).
 *
 * Include this scenario BEFORE the main Vertex AI scenario in your routing rule.
 */

// eslint-disable-next-line no-unused-vars
var GEMINI_VERTEX_CREDENTIALS = {
    type: "service_account",
    project_id: "your-gcp-project-id",
    private_key_id: "example-private-key-id",
    private_key: "-----BEGIN PRIVATE KEY-----\nREDACTED\n-----END PRIVATE KEY-----\n",
    client_email: "vertex-express@your-gcp-project-id.iam.gserviceaccount.com",
    client_id: "123456789012345678901",
    auth_uri: "https://accounts.google.com/o/oauth2/auth",
    token_uri: "https://oauth2.googleapis.com/token",
    auth_provider_x509_cert_url: "https://www.googleapis.com/oauth2/v1/certs",
    client_x509_cert_url: "https://www.googleapis.com/robot/v1/metadata/x509/vertex-express%40your-gcp-project-id.iam.gserviceaccount.com",
    universe_domain: "googleapis.com",
};

// This variable is shared with the scenarios that follow in the routing rule sequence.

```

<Warning title="Credentials are redacted">
  The example credentials are intentionally obfuscated and will not work. Create your own credentials scenario with your real service account JSON.
</Warning>
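A quick structural check can catch copy-paste mistakes before the object reaches the connector. The helper below is illustrative only; it is not part of the VoxEngine or Gemini API:

```javascript
// Sanity-check the shape of a service-account object (illustrative helper).
function looksLikeServiceAccount(creds) {
  return Boolean(
    creds &&
    creds.type === "service_account" &&
    typeof creds.project_id === "string" &&
    typeof creds.private_key === "string" &&
    creds.private_key.includes("BEGIN PRIVATE KEY") &&
    typeof creds.client_email === "string" &&
    creds.client_email.endsWith(".iam.gserviceaccount.com")
  );
}

// Example with redacted placeholder values, matching the scenario above.
const sample = {
  type: "service_account",
  project_id: "your-gcp-project-id",
  private_key: "-----BEGIN PRIVATE KEY-----\nREDACTED\n-----END PRIVATE KEY-----\n",
  client_email: "vertex-express@your-gcp-project-id.iam.gserviceaccount.com",
};
console.log(looksLikeServiceAccount(sample)); // true
```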

## Session setup

The Gemini Live API session is configured via `connectConfig`, passed into `Gemini.createLiveAPIClient(...)`.

In the full scenario, see `CONNECT_CONFIG`:

* `responseModalities: ["AUDIO"]` asks Gemini to speak back in real time.
* `speechConfig` selects a prebuilt voice (`Aoede`).
* `systemInstruction` defines the assistant behavior.
* `inputAudioTranscription` and `outputAudioTranscription` enable transcripts of the caller and the agent.

The Vertex-specific parameters are passed directly to `Gemini.createLiveAPIClient(...)`:

* `backend: Gemini.Backend.VERTEX_AI`
* `project` → `GCP_PROJECT_ID`
* `location` → `GCP_REGION`
* `credentials` → the service account object set by the credentials scenario

## Connect call audio

After `Gemini.LiveAPIEvents.SetupComplete`, bridge audio between the call and Gemini:

```js title="Connect call audio"
VoxEngine.sendMediaBetween(call, voiceAIClient);
```

The same handler sends a starter message to trigger the greeting:

```js title="Trigger the greeting"
voiceAIClient.sendClientContent({
  turns: [{ role: "user", parts: [{ text: "Say hello and ask how you can help." }] }],
  turnComplete: true,
});
```

## Events

The scenario logs Gemini events for debugging:

* `Gemini.LiveAPIEvents`: `SetupComplete`, `ServerContent`, `ToolCall`, `ToolCallCancellation`, `ConnectorInformation`, `Unknown`
* `Gemini.Events`: `WebSocketMediaStarted`, `WebSocketMediaEnded`

## Notes

* This example uses the Vertex AI backend (`Gemini.Backend.VERTEX_AI`).
* The credentials scenario only sets `GEMINI_VERTEX_CREDENTIALS` in the shared global scope and does not start any calls.

[See the VoxEngine API Reference for more details](https://voximplant.com/docs/references/voxengine/gemini).

## Full VoxEngine scenario

```javascript title={"voxengine-gemini-using-vertex-ai.js"} maxLines={0}
/**
 * Voximplant + Gemini Live API connector demo
 * Scenario: answer an incoming call using Gemini Live API via Vertex AI.
 */

require(Modules.Gemini);
require(Modules.ApplicationStorage);

const SYSTEM_PROMPT = `You are Voxi, a helpful voice assistant for phone callers. Keep responses short and telephony-friendly (usually 1-2 sentences).`;

// -------------------- Gemini Live API settings --------------------
const CONNECT_CONFIG = {
    responseModalities: ["AUDIO"],
    speechConfig: {
        voiceConfig: {
            prebuiltVoiceConfig: {voiceName: "Aoede"},
        },
    },
    systemInstruction: {
        parts: [{text: SYSTEM_PROMPT}],
    },
    inputAudioTranscription: {},
    outputAudioTranscription: {},
};

VoxEngine.addEventListener(AppEvents.CallAlerting, async ({call}) => {
    let voiceAIClient;

    // Terminate the scenario when the call ends or fails; add cleanup and logging as needed
    call.addEventListener(CallEvents.Disconnected, () => VoxEngine.terminate());
    call.addEventListener(CallEvents.Failed, () => VoxEngine.terminate());

    try {
        call.answer();
        // call.record({hd_audio: true, stereo: true}); // optional: call recording

        voiceAIClient = await Gemini.createLiveAPIClient({
            credentials: GEMINI_VERTEX_CREDENTIALS,                           // stored in a different scenario
            model: "gemini-live-2.5-flash-preview-native-audio-09-17",
            backend: Gemini.Backend.VERTEX_AI,                                // Set the Vertex back-end
            project: (await ApplicationStorage.get("GCP_PROJECT_ID")).value,  // required for Vertex
            location: (await ApplicationStorage.get("GCP_REGION")).value,     // required for Vertex
            connectConfig: CONNECT_CONFIG,
            onWebSocketClose: (event) => {
                Logger.write("===Gemini.WebSocket.Close===");
                if (event) Logger.write(JSON.stringify(event));
                VoxEngine.terminate();
            },
        });

        // ---------------------- Event handlers -----------------------
        // Wait for Gemini setup, then bridge audio and trigger the greeting
        voiceAIClient.addEventListener(Gemini.LiveAPIEvents.SetupComplete, () => {
            VoxEngine.sendMediaBetween(call, voiceAIClient);
            voiceAIClient.sendClientContent({
                turns: [{role: "user", parts: [{text: "Say hello and ask how you can help."}]}],
                turnComplete: true,
            });
        });

        // Capture transcripts + handle barge-in
        voiceAIClient.addEventListener(Gemini.LiveAPIEvents.ServerContent, (event) => {
            const payload = event?.data?.payload || {};
            if (payload.inputTranscription?.text) {
                Logger.write(`===USER=== ${payload.inputTranscription.text}`);
            }
            if (payload.outputTranscription?.text) {
                Logger.write(`===AGENT=== ${payload.outputTranscription.text}`);
            }
            if (payload.interrupted) {
                Logger.write("===BARGE-IN=== Gemini.LiveAPIEvents.ServerContent");
                voiceAIClient.clearMediaBuffer();
            }
        });

        // Log all Gemini events for illustration/debugging
        [
            Gemini.LiveAPIEvents.SetupComplete,
            Gemini.LiveAPIEvents.ServerContent,
            Gemini.LiveAPIEvents.ToolCall,
            Gemini.LiveAPIEvents.ToolCallCancellation,
            Gemini.LiveAPIEvents.ConnectorInformation,
            Gemini.LiveAPIEvents.Unknown,
            Gemini.Events.WebSocketMediaStarted,
            Gemini.Events.WebSocketMediaEnded,
        ].forEach((eventName) => {
            voiceAIClient.addEventListener(eventName, (event) => {
                Logger.write(`===${event.name}===`);
                if (event?.data) Logger.write(JSON.stringify(event.data));
            });
        });
    } catch (error) {
        Logger.write("===SOMETHING_WENT_WRONG===");
        Logger.write(String(error));
        VoxEngine.terminate();
    }
});

```