6. Best Practices & Tips
Core Principles
Validate Params with Zod
Why it matters: Catch misconfigurations at startup, before they cause data quality issues.
// ✅ Good: Strict validation
schema: z.object({
  poolAddress: ZodEvmAddress,
  trackSwaps: z.boolean().optional(),
  trackLP: z.boolean().optional(),
}).refine((p) => !!p.trackSwaps || !!p.trackLP, {
  message: "Must enable at least one of trackSwaps or trackLP"
})
// ❌ Bad: Loose validation
schema: z.object({
  poolAddress: z.string(), // Any string could be invalid
  trackSwaps: z.any(), // No type safety
})
- Early Error Detection: Fail fast on startup
- Clear Error Messages: Help users fix configuration issues
- Type Safety: Runtime type checking for all parameters
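If Zod is not available in your environment, the same fail-fast behavior can be hand-rolled. A minimal sketch, mirroring the schema above (the `Params` shape and the address regex are assumptions standing in for `ZodEvmAddress`):

```typescript
interface Params {
  poolAddress: string;
  trackSwaps?: boolean;
  trackLP?: boolean;
}

function validateParams(p: Params): Params {
  // Stand-in for ZodEvmAddress: 0x followed by 40 hex characters.
  if (!/^0x[0-9a-fA-F]{40}$/.test(p.poolAddress)) {
    throw new Error(`Invalid EVM address: ${p.poolAddress}`);
  }
  // Stand-in for the .refine() rule above: at least one toggle on.
  if (!p.trackSwaps && !p.trackLP) {
    throw new Error("Must enable at least one of trackSwaps or trackLP");
  }
  return p;
}
```

Either way, the point is the same: reject bad configuration at startup with a message the user can act on.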
Use Redis Aggressively
Why it matters: Minimize expensive RPC calls while maintaining data freshness.
// ✅ Good: Cache pool metadata
const token0Key = `univ2:${params.poolAddress}:token0`;
let token0 = await redis.get(token0Key);
if (!token0) {
  token0 = (await poolContract.token0()).toLowerCase();
  await redis.set(token0Key, token0);
}
// ❌ Bad: Always fetch from RPC
const token0 = await poolContract.token0(); // Expensive!
- Pool Constants: Token addresses, fee tiers (cache permanently)
- Dynamic Data: Reserves, prices (cache with TTL)
- User Data: Balances, allowances (cache with short TTL)
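The TTL tiers above can be illustrated with a tiny in-process cache. This is a sketch, not a Redis client; in production you would use Redis `SET` with an expiry instead:

```typescript
// Minimal in-process TTL cache: permanent-ish data gets a long TTL,
// dynamic data a short one. Expired entries read as misses.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  set(key: string, value: V, ttlMs: number): void {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // Expired: evict and report a miss
      return undefined;
    }
    return entry.value;
  }
}
```

The same key scheme as above (`univ2:<pool>:token0`) works here, with `ttlMs` chosen per tier.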
Lowercase Addresses
Why it matters: Prevents duplicate state caused by inconsistent address casing.
// ✅ Good: Consistent normalization
const userAddress = log.transaction.from.toLowerCase();
const tokenAddress = (await contract.token0()).toLowerCase();
// ❌ Bad: Mixed case handling
const userAddress = log.transaction.from; // Mixed case
const tokenAddress = await contract.token0(); // Mixed case
- Database Duplicates: Same user appears as different entities
- Cache Misses: Same address cached under different keys
- API Inconsistencies: Mixed case in responses
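A simple way to enforce this is to normalize once at the boundary, so every downstream key, entity ID, and API response uses one casing. A minimal sketch (`normalizeAddress` is an illustrative helper, not part of any SDK):

```typescript
// Normalize and validate an EVM address at the point of ingestion.
function normalizeAddress(addr: string): string {
  const a = addr.trim().toLowerCase();
  if (!/^0x[0-9a-f]{40}$/.test(a)) {
    throw new Error(`Not a valid EVM address: ${addr}`);
  }
  return a;
}
```

Call it on every address the moment it enters your adapter, and the duplicate-entity and cache-miss problems above cannot occur.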
Always Emit Deterministic Keys
Why it matters: Prevents duplicate events during reprocessing or restarts.
// ✅ Good: Deterministic key generation
const key = `${log.transactionHash.toLowerCase()}:${log.logIndex}`; // Same log always yields the same key
// ❌ Bad: Non-deterministic keys
const key = crypto.randomBytes(16).toString('hex'); // Changes every time
const key = Date.now().toString(); // Time-based, not unique
- Uniqueness: Same event always generates same key
- Stability: Doesn't change between runs
- Collision-Free: Extremely low collision probability
Performance Optimization
Separate LP vs Swap Logic
Why it matters: Allows users to enable only what they need, reducing processing overhead.
// ✅ Good: Independent toggles
if (params.trackLP && log.topics[0] === transferTopic) {
  // Handle LP tracking
}
if (params.trackSwaps && log.topics[0] === swapTopic) {
  // Handle swap tracking
}
// ❌ Bad: Always process everything
if (log.topics[0] === transferTopic) {
  // Process even if LP tracking disabled
}
Example configuration:
{
  "poolAddress": "0x...",
  "trackSwaps": true,
  "trackLP": false // Skip LP processing entirely
}
Batch Operations
Why it matters: Reduce database round trips and improve throughput.
// ✅ Good: Batch event emissions
const events = [];
if (shouldEmitLP) {
  events.push(lpEvent);
}
if (shouldEmitSwap) {
  events.push(swapEventIn, swapEventOut);
}
await Promise.all(events.map(event => emit[event.type](event)));
// ❌ Bad: Sequential emissions
await emit.balanceDelta(lpEvent);
await emit.swap(swapEventIn);
await emit.swap(swapEventOut);
Error Handling & Resilience
Graceful Degradation
// ✅ Good: Handle RPC failures
try {
  const tokenInfo = await contract.token0();
  await redis.set(tokenKey, tokenInfo.toLowerCase());
} catch (error) {
  console.error(`Failed to fetch token for ${params.poolAddress}:`, error);
  // Continue processing - use cached value if available
}
// ❌ Bad: Crash on RPC failure
const tokenInfo = await contract.token0(); // Throws on network issues
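The "use cached value if available" fallback can be factored into a reusable helper. A sketch, where `fetchLive` stands in for an RPC call and `readCache` for a Redis lookup (both names are illustrative):

```typescript
// Try the live source first; on failure, degrade to the cached value
// instead of crashing the adapter.
async function getWithFallback<T>(
  fetchLive: () => Promise<T>,
  readCache: () => Promise<T | undefined>,
): Promise<T | undefined> {
  try {
    return await fetchLive();
  } catch (error) {
    console.error("Live fetch failed, falling back to cache:", error);
    return await readCache();
  }
}
```

Callers must still handle `undefined` (no live value and no cached value), which keeps the degraded path explicit.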
Circuit Breakers
// ✅ Good: Prevent cascade failures
let rpcFailures = 0;
const MAX_FAILURES = 5;

async function guardedRpcCall() {
  try {
    const result = await rpc.call();
    rpcFailures = 0; // Reset on success
    return result;
  } catch (error) {
    rpcFailures++;
    if (rpcFailures >= MAX_FAILURES) {
      throw new Error('RPC circuit breaker triggered');
    }
    // Retry with backoff or use cached data
  }
}
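The same failure counter can be packaged as a small class so the breaker state travels with the endpoint it guards. A self-contained sketch (not a production implementation; real breakers also add a cool-down before closing again):

```typescript
// After maxFailures consecutive failures the breaker "opens" and
// rejects calls immediately instead of hammering a failing endpoint.
class CircuitBreaker {
  private failures = 0;

  constructor(private readonly maxFailures = 5) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (this.failures >= this.maxFailures) {
      throw new Error("Circuit breaker open");
    }
    try {
      const result = await fn();
      this.failures = 0; // Any success resets the counter
      return result;
    } catch (error) {
      this.failures++;
      throw error;
    }
  }
}
```

Wrapping each RPC endpoint in its own `CircuitBreaker` instance keeps one flaky endpoint from consuming retries meant for healthy ones.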
Code Organization
Modular Function Structure
// ✅ Good: Single responsibility functions
async function handleLPTransfer(log: Log, params: Params) {
  const { from, to, value } = decodeTransfer(log);
  await emitBalanceDeltas(from, to, value, params.poolAddress);
}

async function handleSwap(log: Log, params: Params) {
  const swap = decodeSwap(log);
  await emitSwapEvents(swap, params.poolAddress);
}
// ❌ Bad: Monolithic handler
async function handleLog(log: Log, params: Params) {
  if (log.topics[0] === transferTopic) {
    // 50 lines of LP logic
  } else if (log.topics[0] === swapTopic) {
    // 50 lines of swap logic
  }
}
Configuration-Driven Behavior
// ✅ Good: Configurable processing
const processors = {
  [transferTopic]: params.trackLP ? handleLPTransfer : null,
  [swapTopic]: params.trackSwaps ? handleSwap : null,
};
// ❌ Bad: Hardcoded logic
if (log.topics[0] === transferTopic) {
  // Always process LP, ignore config
}
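The processor-map pattern above can be shown end to end in a runnable sketch. The topic values and handlers here are placeholders, not real keccak topic hashes or adapter functions:

```typescript
type Handler = () => string;

const transferTopic = "0xddf252ad"; // placeholder, not a full topic hash
const swapTopic = "0xd78ad95f";     // placeholder, not a full topic hash

// Build the map once from config; disabled features get no handler.
function buildProcessors(params: { trackLP: boolean; trackSwaps: boolean }) {
  const processors: Record<string, Handler | null> = {
    [transferTopic]: params.trackLP ? () => "lp" : null,
    [swapTopic]: params.trackSwaps ? () => "swap" : null,
  };
  return processors;
}

// One lookup per log; unconfigured topics fall through untouched.
function dispatch(processors: Record<string, Handler | null>, topic0: string): string | null {
  const handler = processors[topic0];
  return handler ? handler() : null;
}
```

The dispatch step never consults `params` directly: the configuration decision was made once, when the map was built.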
Testing & Validation
Unit Test Coverage
// ✅ Good: Test key functions
describe('handleSwap', () => {
  it('emits correct events for token0->token1 swap', async () => {
    const log = createMockSwapLog({
      amount0In: 1000000n,
      amount1Out: 300000000000000000n
    });
    await handleSwap(log, params);
    expect(emittedEvents).toHaveLength(2);
    expect(emittedEvents[0].amount.asset).toBe(token0Address);
  });
});
Integration Testing
// ✅ Good: End-to-end validation
describe('UniswapV2Adapter', () => {
  it('processes real transaction correctly', async () => {
    const adapter = new UniswapV2Adapter({
      poolAddress: '0x...',
      trackSwaps: true,
      trackLP: true
    });
    await adapter.processBlock(blockWithSwapsAndTransfers);
    expect(balanceDeltas).toHaveLength(expectedDeltas);
    expect(swapEvents).toHaveLength(expectedSwaps);
  });
});
Monitoring & Observability
Key Metrics to Track
// ✅ Good: Instrument your adapter
const metrics = {
  eventsProcessed: 0,
  rpcCalls: 0,
  cacheHits: 0,
  cacheMisses: 0,
  errors: 0
};
// Log periodically
console.log(`Processed ${metrics.eventsProcessed} events; cache hits/misses: ${metrics.cacheHits}/${metrics.cacheMisses}`);
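To report an actual hit rate rather than raw counts, derive it from the counters above. A small sketch:

```typescript
// Hit rate = hits / (hits + misses), guarding against division by zero
// before any cache traffic has occurred.
function cacheHitRate(hits: number, misses: number): number {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}
```

Logging `(cacheHitRate(metrics.cacheHits, metrics.cacheMisses) * 100).toFixed(1) + "%"` gives a number that is directly comparable across runs, regardless of volume.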
Migration & Maintenance
Version Management
// ✅ Good: Semantic versioning
export default registerAdapter({
  name: 'uniswap-v2',
  semver: '0.0.1', // Increment on breaking changes
  // ...
});
Backward Compatibility
// ✅ Good: Graceful config migration
function migrateConfig(oldConfig: any): NewConfig {
  return {
    poolAddress: oldConfig.poolAddress,
    trackSwaps: oldConfig.trackSwaps ?? true, // Default to true
    trackLP: oldConfig.trackLP ?? true, // Default to true
  };
}
Summary
Following these practices will help you build:
- Reliable adapters that handle edge cases gracefully
- Performant systems that scale with your data volume
- Maintainable code that's easy to debug and extend
- User-friendly configurations that are self-documenting
Remember: Simplicity first, optimization second. Start with working code, then optimize based on real performance data.