Streaming Output
Stream code execution output in real time
For long-running code executions, you can stream stdout and stderr in real time instead of waiting for the execution to complete. This is useful for progress indicators, live logging, and keeping users informed during heavy computations.
Basic Streaming
JavaScript

import { CodeInterpreter } from "vibengine";

const sandbox = await CodeInterpreter.create({
  apiKey: process.env.VE_API_KEY,
});

const execution = await sandbox.runCode(
  `
import time

for i in range(1, 6):
    print(f"Step {i}/5: Processing batch {i}...")
    time.sleep(1)

print("All batches complete.")
`,
  {
    onStdout: (output) => {
      console.log("[stdout]", output.line);
    },
    onStderr: (output) => {
      console.error("[stderr]", output.line);
    },
  }
);

await sandbox.kill();

Python

import os
from vibengine import CodeInterpreter

sandbox = CodeInterpreter.create(api_key=os.environ["VE_API_KEY"])

execution = sandbox.run_code("""
import time

for i in range(1, 6):
    print(f"Step {i}/5: Processing batch {i}...")
    time.sleep(1)

print("All batches complete.")
""", on_stdout=lambda output: print("[stdout]", output.line),
    on_stderr=lambda output: print("[stderr]", output.line))

sandbox.kill()

The callbacks fire as each line is printed, so you see output incrementally rather than all at once when the execution finishes.
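Internally, streaming interfaces like this typically buffer raw output and fire the callback once per complete line, even when the underlying chunks arrive split mid-line. A simplified, SDK-independent sketch of that dispatch logic (the `LineStreamer` class here is purely illustrative, not the library's actual implementation):

```python
class LineStreamer:
    """Buffers raw chunks and fires a callback once per complete line.

    Illustrative only -- not the SDK's real internals.
    """

    def __init__(self, on_line):
        self.on_line = on_line
        self._buffer = ""

    def feed(self, chunk):
        # Output may arrive in arbitrary chunks; split on newlines and
        # keep any trailing partial line in the buffer for the next feed.
        self._buffer += chunk
        *complete, self._buffer = self._buffer.split("\n")
        for line in complete:
            self.on_line(line)

    def close(self):
        # Flush a final unterminated line, if any.
        if self._buffer:
            self.on_line(self._buffer)
            self._buffer = ""

lines = []
streamer = LineStreamer(lines.append)
streamer.feed("Step 1/5: Proc")
streamer.feed("essing batch 1...\nStep 2/5")
streamer.feed(": Processing batch 2...\n")
streamer.close()
print(lines)  # two complete lines, despite the mid-line chunk boundaries
```

Holding the trailing partial line back until its newline arrives is what guarantees each callback receives exactly one whole line.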
Streaming Progress Updates
Streaming is especially useful for tasks that report progress, such as data processing or model training.
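A consumer can parse the percentage out of each streamed line to drive a UI progress bar or metrics. A minimal, SDK-independent parser, assuming progress lines in the `Progress: NN% (x/y records)` format used in this section:

```python
import re

# Matches lines like "Progress: 40% (400/1000 records)"
PROGRESS_RE = re.compile(r"Progress: (\d+)% \((\d+)/(\d+) records\)")

def parse_progress(line):
    """Return (percent, processed, total) for a progress line, else None."""
    m = PROGRESS_RE.search(line)
    if not m:
        return None
    return tuple(int(g) for g in m.groups())

print(parse_progress("Progress: 40% (400/1000 records)"))  # (40, 400, 1000)
print(parse_progress("Processing complete."))              # None
```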
JavaScript

const execution = await sandbox.runCode(
  `
import time

total = 1000
batch_size = 100

for i in range(0, total, batch_size):
    processed = min(i + batch_size, total)
    pct = (processed / total) * 100
    print(f"Progress: {pct:.0f}% ({processed}/{total} records)")
    time.sleep(0.5)

print("Processing complete. 1000 records transformed.")
`,
  {
    onStdout: (output) => {
      // Forward to your UI, WebSocket, or SSE stream
      process.stdout.write(`\r${output.line}`);
    },
  }
);

console.log("\nFinal result:", execution.text);

Python

execution = sandbox.run_code("""
import time

total = 1000
batch_size = 100

for i in range(0, total, batch_size):
    processed = min(i + batch_size, total)
    pct = (processed / total) * 100
    print(f"Progress: {pct:.0f}% ({processed}/{total} records)")
    time.sleep(0.5)

print("Processing complete. 1000 records transformed.")
""", on_stdout=lambda output: print(f"\r{output.line}", end="", flush=True))

print(f"\nFinal result: {execution.text}")

Streaming Stderr for Warnings and Diagnostics
Many libraries write diagnostic information to stderr. Capture these separately from stdout.
JavaScript

const stdoutLines = [];
const stderrLines = [];

const execution = await sandbox.runCode(
  `
import sys
import time

print("Starting data validation...")

# Simulate warnings on stderr
for col in ["email", "age", "name"]:
    print(f"Validating column: {col}")
    time.sleep(0.3)
    if col == "age":
        print(f"WARNING: column {col} has null values", file=sys.stderr)

print("Validation complete.")
`,
  {
    onStdout: (output) => {
      stdoutLines.push(output.line);
      console.log(" ", output.line);
    },
    onStderr: (output) => {
      stderrLines.push(output.line);
      console.warn("⚠", output.line);
    },
  }
);

console.log(`\nCaptured ${stdoutLines.length} stdout lines`);
console.log(`Captured ${stderrLines.length} stderr warnings`);

Python

stdout_lines = []
stderr_lines = []

def on_stdout(output):
    stdout_lines.append(output.line)
    print(f" {output.line}")

def on_stderr(output):
    stderr_lines.append(output.line)
    print(f"WARNING: {output.line}")

execution = sandbox.run_code("""
import sys
import time

print("Starting data validation...")

for col in ["email", "age", "name"]:
    print(f"Validating column: {col}")
    time.sleep(0.3)
    if col == "age":
        print(f"WARNING: column {col} has null values", file=sys.stderr)

print("Validation complete.")
""", on_stdout=on_stdout, on_stderr=on_stderr)

print(f"\nCaptured {len(stdout_lines)} stdout lines")
print(f"Captured {len(stderr_lines)} stderr warnings")

Streaming with Timeouts
Combine streaming with timeouts to monitor long-running tasks and abort if they take too long.
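The timeout behavior can be pictured as a deadline applied while output is still being consumed. A rough plain-Python sketch of that idea (illustrative only; `consume_with_deadline` and `slow_producer` are made-up names, not SDK APIs):

```python
import time

def consume_with_deadline(producer, on_line, timeout_s):
    """Consume lines until the producer finishes or the deadline passes.

    Returns True if the producer finished, False if it was cut off.
    Illustrative only -- not how the SDK enforces its timeout.
    """
    deadline = time.monotonic() + timeout_s
    for line in producer:
        if time.monotonic() > deadline:
            return False  # deadline exceeded: stop consuming
        on_line(line)
    return True

def slow_producer(n, delay):
    # Stand-in for a long-running execution that emits one line per step
    for i in range(n):
        time.sleep(delay)
        yield f"Iteration {i + 1}/{n}"

seen = []
finished = consume_with_deadline(slow_producer(50, 0.02), seen.append, timeout_s=0.2)
print(finished, len(seen))  # False -- only the lines emitted before the deadline
```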
JavaScript

try {
  const execution = await sandbox.runCode(
    `
import time

for i in range(100):
    print(f"Iteration {i + 1}/100")
    time.sleep(0.5)
`,
    {
      timeoutMs: 5000,
      onStdout: (output) => {
        console.log(output.line);
      },
    }
  );

  if (execution.error) {
    console.log("Execution stopped:", execution.error.message);
  }
} catch (error) {
  console.error("Execution timed out");
}

Python

execution = sandbox.run_code("""
import time

for i in range(100):
    print(f"Iteration {i + 1}/100")
    time.sleep(0.5)
""", timeout=5, on_stdout=lambda output: print(output.line))

if execution.error:
    print(f"Execution stopped: {execution.error.message}")

Even when streaming, the final execution object still contains the complete logs.stdout and logs.stderr arrays for post-processing.
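Post-processing those arrays is then ordinary list handling. For example, a run summary could be built from the captured lines like this (the lists below are stand-ins for `logs.stdout` / `logs.stderr`; no SDK required):

```python
# Stand-ins for execution.logs.stdout / execution.logs.stderr
stdout_logs = [
    "Validating column: email",
    "Validating column: age",
    "Validation complete.",
]
stderr_logs = ["WARNING: column age has null values"]

# Collect warnings and build a summary once the run has finished
warnings = [line for line in stderr_logs if line.startswith("WARNING")]
summary = {
    "stdout_lines": len(stdout_logs),
    "warnings": warnings,
    "ok": len(warnings) == 0,
}
print(summary)
```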
Streaming callbacks are invoked asynchronously. Avoid doing heavy computation inside the callbacks as it can slow down output delivery.
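One common way to keep callbacks cheap is to enqueue each line and do the expensive work on a separate consumer thread. A generic sketch of that pattern, not specific to this SDK:

```python
import queue
import threading

line_queue = queue.Queue()
processed = []

def worker():
    # Drain the queue; heavy per-line work happens here, off the callback path
    while True:
        line = line_queue.get()
        if line is None:  # sentinel: no more lines
            break
        processed.append(line.upper())  # stand-in for expensive processing
        line_queue.task_done()

t = threading.Thread(target=worker)
t.start()

def on_stdout(line):
    # The callback stays O(1): just enqueue and return immediately
    line_queue.put(line)

for line in ["step 1", "step 2", "step 3"]:
    on_stdout(line)

line_queue.put(None)  # signal completion
t.join()
print(processed)  # ['STEP 1', 'STEP 2', 'STEP 3']
```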