---
title: 'Example: Placing an outbound call'
---

This example starts a VoxEngine session, places an outbound PSTN call, and bridges audio to OpenAI Realtime once the callee answers.

**Jump to the [Full VoxEngine scenario](#full-voxengine-scenario).**

## Prerequisites

* Store your OpenAI API key in Voximplant `ApplicationStorage` under `OPENAI_API_KEY`.
* Ensure outbound calling is enabled for your Voximplant application and that your caller ID is verified.

## Outbound call parameters

The example expects the destination number and caller ID in `customData` (read via `VoxEngine.customData()`):

```json title="Custom data example"
{"destination":"+15551234567","callerId":"+15557654321"}
```

## Launch the routing rule

For quick testing, you can start this outbound scenario from the Voximplant Control Panel:

1. Open your Voximplant application and go to the **Routing** tab.
2. Select the routing rule that has this scenario attached.
3. Click **Run**.
4. Provide **Custom data** (max 200 bytes) with `destination` and `callerId`:

```json title="Custom data example"
{"destination":"+15551234567","callerId":"+15557654321"}
```

For production, start the routing rule via the Management API [`startScenarios`](https://voximplant.com/docs/references/httpapi/scenarios#startscenarios) method: pass `rule_id`, and pass the same JSON string in `script_custom_data`.
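The sketch below shows one way to call `startScenarios` from a backend service. It is a minimal illustration, not a drop-in script: it assumes Node.js 18+ (for the built-in `fetch`) and `account_id`/`api_key` authentication; the environment variable names are placeholders, and the endpoint and parameter names follow the reference linked above.

```js title="Start the scenario via the Management API (Node.js sketch)"
// Minimal sketch: start the routing rule that runs the outbound scenario.
// VOX_ACCOUNT_ID, VOX_API_KEY, and VOX_RULE_ID are placeholder environment variables.
async function startOutboundScenario() {
  const params = new URLSearchParams({
    account_id: process.env.VOX_ACCOUNT_ID, // your Voximplant account ID
    api_key: process.env.VOX_API_KEY,       // your Management API key
    rule_id: process.env.VOX_RULE_ID,       // the routing rule with this scenario attached
    // The same JSON the scenario reads via VoxEngine.customData()
    script_custom_data: JSON.stringify({
      destination: "+15551234567",
      callerId: "+15557654321",
    }),
  });

  const response = await fetch(
    `https://api.voximplant.com/platform_api/StartScenarios/?${params}`
  );
  const result = await response.json();
  console.log(result); // inspect the API response for the started session details
}

startOutboundScenario().catch(console.error);
```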
## Connect call audio

After the callee answers, the example bridges audio both ways:

```js title="Connect call audio"
VoxEngine.sendMediaBetween(call, voiceAIClient);
```

## Barge-in

When the callee starts speaking over the assistant, clear the buffered AI audio so the response stops immediately:

```js title="Barge-in"
voiceAIClient.addEventListener(OpenAI.RealtimeAPIEvents.InputAudioBufferSpeechStarted, () => {
  voiceAIClient.clearMediaBuffer();
});
```

## Notes

[See the VoxEngine API Reference for more details](https://voximplant.com/docs/references/voxengine/openai).

## Full VoxEngine scenario

```javascript title={"voxengine-openai-place-outbound-call.js"} maxLines={0}
/**
 * Voximplant + OpenAI Realtime API connector demo
 * Scenario: place an outbound PSTN call and bridge it to OpenAI Realtime.
 */
require(Modules.OpenAI);
require(Modules.ApplicationStorage);

const SYSTEM_PROMPT = `
You are Voxi, a concise phone assistant for outbound calls.
Keep responses short and helpful.
`;

const SESSION_CONFIG = {
  session: {
    type: "realtime",
    instructions: SYSTEM_PROMPT,
    voice: "alloy",
    output_modalities: ["audio"],
    turn_detection: {type: "server_vad", interrupt_response: true},
  },
};

const MAX_CALL_MS = 2 * 60 * 1000;

VoxEngine.addEventListener(AppEvents.Started, async () => {
  let call;
  let voiceAIClient;
  let hangupTimer;

  try {
    // Custom data example: {"destination":"+15551234567","callerId":"+15557654321"}
    const {destination, callerId} = JSON.parse(VoxEngine.customData());

    call = VoxEngine.callPSTN(destination, callerId);

    call.addEventListener(CallEvents.Failed, () => VoxEngine.terminate());
    call.addEventListener(CallEvents.Disconnected, () => {
      if (hangupTimer) clearTimeout(hangupTimer);
      VoxEngine.terminate();
    });

    call.addEventListener(CallEvents.Connected, async () => {
      hangupTimer = setTimeout(() => {
        Logger.write("===HANGUP_TIMER===");
        call.hangup();
      }, MAX_CALL_MS);

      voiceAIClient = await OpenAI.createRealtimeAPIClient({
        apiKey: (await ApplicationStorage.get("OPENAI_API_KEY")).value,
        model: "gpt-realtime",
        onWebSocketClose: (event) => {
          Logger.write("===OpenAI.WebSocket.Close===");
          if (event) Logger.write(JSON.stringify(event));
          VoxEngine.terminate();
        },
      });

      voiceAIClient.addEventListener(OpenAI.RealtimeAPIEvents.SessionCreated, () => {
        voiceAIClient.sessionUpdate(SESSION_CONFIG);
      });

      voiceAIClient.addEventListener(OpenAI.RealtimeAPIEvents.SessionUpdated, () => {
        VoxEngine.sendMediaBetween(call, voiceAIClient);
        voiceAIClient.responseCreate({instructions: "Hello! This is Voxi. How can I help today?"});
      });

      voiceAIClient.addEventListener(
        OpenAI.RealtimeAPIEvents.InputAudioBufferSpeechStarted,
        () => {
          Logger.write("===BARGE-IN: OpenAI.InputAudioBufferSpeechStarted===");
          voiceAIClient.clearMediaBuffer();
        }
      );

      // Consolidated "log-only" handlers
      [
        OpenAI.RealtimeAPIEvents.ResponseCreated,
        OpenAI.RealtimeAPIEvents.ResponseDone,
        OpenAI.RealtimeAPIEvents.ResponseOutputAudioDone,
        OpenAI.RealtimeAPIEvents.ConversationItemInputAudioTranscriptionCompleted,
        OpenAI.RealtimeAPIEvents.ResponseOutputAudioTranscriptDone,
        OpenAI.RealtimeAPIEvents.ConnectorInformation,
        OpenAI.RealtimeAPIEvents.HTTPResponse,
        OpenAI.RealtimeAPIEvents.WebSocketError,
        OpenAI.RealtimeAPIEvents.Unknown,
        OpenAI.Events.WebSocketMediaStarted,
        OpenAI.Events.WebSocketMediaEnded,
      ].forEach((eventName) => {
        voiceAIClient.addEventListener(eventName, (event) => {
          Logger.write(`===${event.name}===`);
          if (event?.data) Logger.write(JSON.stringify(event.data));
        });
      });
    });
  } catch (error) {
    Logger.write("===UNHANDLED_ERROR===");
    Logger.write(String(error));
    voiceAIClient?.close();
    VoxEngine.terminate();
  }
});
```