Quickstart
Get your first event into Axiom in under 5 minutes. This guide uses the Node.js SDK, but the same steps apply to Python, Java, and Go.
Prerequisites: Node.js 18+, an Axiom account, and an API key with write:events scope.
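The examples below read the API key from the AXIOM_KEY environment variable, so export it first (replace the placeholder with your own key):

```shell
# Make the API key available to the curl and Node.js examples below.
export AXIOM_KEY="your-api-key"
```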
Step 1 — Install the SDK
npm install @axiom/sdk
Step 2 — Create a stream
Create a stream via the console or the API. A stream is the top-level container for events.
curl -X POST https://api.axiom.io/v1/streams \
  -H "Authorization: Bearer $AXIOM_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "my-app-events", "partitions": 12, "retention_days": 30}'
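If you'd rather not shell out to curl, the same request works from Node.js 18+ with the built-in fetch. A minimal sketch, assuming only the endpoint and payload shown above (createStream is an illustrative helper, not part of the SDK):

```javascript
// Create a stream with the same POST request as the curl example.
// Uses Node 18+'s built-in fetch; AXIOM_KEY must be set in the environment.
async function createStream(name, { partitions, retention_days }) {
  const res = await fetch('https://api.axiom.io/v1/streams', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.AXIOM_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ name, partitions, retention_days }),
  });
  if (!res.ok) {
    throw new Error(`Stream creation failed: ${res.status}`);
  }
  return res.json();
}
```

Usage: `await createStream('my-app-events', { partitions: 12, retention_days: 30 });`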
Step 3 — Publish events
import { AxiomClient } from '@axiom/sdk';

const client = new AxiomClient({
  apiKey: process.env.AXIOM_KEY,
  region: 'us-east-1',
});

await client.streams.publish('my-app-events', [
  {
    timestamp: new Date().toISOString(),
    user_id: 'usr_123',
    event: 'page_view',
    url: '/dashboard',
    duration_ms: 142,
  },
]);
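Publish calls can fail transiently (network blips, rate limits). A small retry helper with exponential backoff is a common pattern; this is a sketch that wraps any async function, including the publish call above (the attempts and delay values are illustrative defaults, not SDK settings):

```javascript
// Retry an async operation with exponential backoff.
async function withRetry(fn, { attempts = 3, baseDelayMs = 200 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Wait baseDelayMs, then double it on each failure: 200ms, 400ms, 800ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

Usage: `await withRetry(() => client.streams.publish('my-app-events', events));`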
Step 4 — Consume events
const consumer = client.streams.subscribe('my-app-events', {
  groupId: 'my-consumer-group',
  fromOffset: 'earliest',
});

for await (const batch of consumer) {
  console.log(`Got ${batch.events.length} events`);
  await batch.commit(); // acknowledge the batch so the group's offset advances
}
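A common refinement is to process every event in a batch and commit only once all of them succeed, so that a crash re-delivers uncommitted events rather than dropping them. A sketch that works with any consumer shaped like the one above (an async iterable of batches, each with an events array and a commit() method); processBatches is an illustrative helper, not part of the SDK:

```javascript
// Process batches from a consumer, committing only after every event
// in the batch has been handled successfully.
async function processBatches(consumer, handleEvent) {
  for await (const batch of consumer) {
    for (const event of batch.events) {
      await handleEvent(event);
    }
    await batch.commit(); // skipped if handleEvent throws, so the batch is redelivered
  }
}
```

Usage: `await processBatches(consumer, async (e) => console.log(e.event));`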
Step 5 — Query with SQL
const result = await client.query(`
  SELECT event, COUNT(*) AS count
  FROM "my-app-events"
  WHERE _time > NOW() - INTERVAL '1 hour'
  GROUP BY event
  ORDER BY count DESC
`);
console.table(result.rows);
What's next?
- Full API Reference — all endpoints and options
- Set up Kafka Connect to pipe data to BigQuery or Snowflake
- Configure schema validation to enforce event contracts
- Enable geo-replication for multi-region redundancy