# How to Test Your Adapter
This guide explains how to test and verify your adapter's correctness using manual CSV verification.
## Prerequisites
Before testing, ensure you have:
- Redis running locally (`docker run -d -p 6379:6379 redis`)
- RPC URL for your target chain
- Block range to test (should start at the block where the contract was deployed)
- Access to a block explorer (Etherscan, etc.)
## Step 1: Create a Test Configuration
Create a minimal config file at `adapters/your-adapter/tests/config/test.absinthe.json`:
```json
{
  "chainArch": "evm",
  "flushInterval": "1h",
  "redisUrl": "redis://localhost:6379",
  "sinkConfig": {
    "sinks": [{ "sinkType": "csv", "path": "test-output.csv" }, { "sinkType": "stdout" }]
  },
  "network": {
    "chainId": 1,
    "gatewayUrl": "https://v2.archive.subsquid.io/network/ethereum-mainnet",
    "rpcUrl": "https://your-rpc-url",
    "finality": 75
  },
  "range": {
    "fromBlock": 18000000,
    "toBlock": 18001000
  },
  "adapterConfig": {
    "adapterId": "your-adapter-name",
    "config": {
      "yourTrackable": [
        {
          "params": {
            "contractAddress": "0x..."
          },
          "pricing": {
            "kind": "pegged",
            "usdPegValue": 1
          }
        }
      ]
    }
  }
}
```

### Testing Best Practices
| Practice | Why |
|---|---|
| Small block range | Faster iteration (1,000-5,000 blocks) |
| Use pegged pricing | Eliminates price API variability |
| Include stdout sink | See events in real-time |
| CSV output | Easy to analyze and share |
## Step 2: Run the Adapter
```bash
# Clear Redis state (important for clean tests)
redis-cli FLUSHALL

# Run the adapter
pnpm run dev -- --config adapters/your-adapter/tests/config/test.absinthe.json
```

### Expected Output
You should see:
- Config validation message
- Block processing progress
- Flush events (every `flushInterval`)
- CSV file creation
```
[INFO] Config loaded and validated
[INFO] Processing blocks 18000000-18001000...
[INFO] Block 18000100 processed
[INFO] Block 18000200 processed
...
[INFO] Flushing 42 positions to CSV...
[INFO] Complete. Output: test-output.csv
```

## Step 3: Inspect the CSV Output
Open the CSV and take a first look:
```bash
# View first 20 lines
head -20 test-output.csv

# Count total rows
wc -l test-output.csv

# View specific columns (user, asset, quantity)
cut -d',' -f1,3,5 test-output.csv | head -20
```

### Expected CSV Format
#### For Positions
```
user,trackable_id,asset_key,activity,quantity,quantity_basis,window_start,window_end,...
0x1234...,token-1,erc20:0xabc...,hold,1000.5,asset_amt,1700000000,1700003600,...
0x5678...,token-1,erc20:0xabc...,hold,500.25,asset_amt,1700000000,1700003600,...
```

#### For Actions
```
user,trackable_id,asset_key,activity,quantity,quantity_basis,ts_ms,tx_ref,...
0x1234...,swap-1,erc20:0xabc...,swap,100.5,monetary_value,1700000000000,0xdef...,...
```

#### Key Fields to Check
| Field | What to Verify |
|---|---|
| `user` | Valid Ethereum address (`0x...`, 42 chars) |
| `asset_key` | Matches expected format (`erc20:0x...`, `custom:prefix`) |
| `quantity` | Non-negative, reasonable magnitude |
| `quantity_basis` | `asset_amt` or `monetary_value` |
| `window_start`/`window_end` | Valid Unix timestamps, end > start |
| `tx_ref` | Valid transaction hash |
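If you want to automate these checks, a small script can scan every row. A minimal sketch, assuming the positions column order shown above and a file named `test-output.csv` (adjust the indices for action rows):

```ts
// check-csv.ts — rough field sanity checks over the positions CSV.
// Column order assumed: user, trackable_id, asset_key, activity, quantity,
// quantity_basis, window_start, window_end, ...
import { readFileSync } from "node:fs";

const lines = readFileSync("test-output.csv", "utf8").trim().split("\n").slice(1); // skip header

lines.forEach((line, i) => {
  const [user, , assetKey, , quantity, quantityBasis, windowStart, windowEnd] = line.split(",");

  if (!/^0x[0-9a-fA-F]{40}$/.test(user)) console.warn(`row ${i}: bad user address ${user}`);
  if (!/^(erc20:0x|custom:)/.test(assetKey)) console.warn(`row ${i}: unexpected asset_key ${assetKey}`);
  if (Number(quantity) < 0) console.warn(`row ${i}: negative quantity ${quantity}`);
  if (!["asset_amt", "monetary_value"].includes(quantityBasis))
    console.warn(`row ${i}: unknown quantity_basis ${quantityBasis}`);
  if (Number(windowEnd) <= Number(windowStart))
    console.warn(`row ${i}: window_end ${windowEnd} is not after window_start ${windowStart}`);
});

console.log(`Checked ${lines.length} rows`);
```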
## Step 4: Cross-Reference with On-Chain Data
### 4.1 Pick a Sample Transaction
From your CSV output, pick a specific transaction to verify:
```
0x1234...,swap-1,erc20:0xweth...,swap,1.5,monetary_value,1700000000000,0xabc123...
```

### 4.2 Look Up the Transaction
Go to your block explorer (Etherscan, etc.) and look up the transaction:
```
https://etherscan.io/tx/0xabc123...
```

### 4.3 Verify the Data
Check that:
| CSV Field | On-Chain Data |
|---|---|
| `user` | Transaction `from` address |
| `asset_key` | Correct token contract |
| `quantity` | Matches event logs (after decimals) |
| `ts_ms` | Block timestamp × 1000 |
| `tx_ref` | Transaction hash |
### Example Verification
**CSV Row:**

```
0x1234abcd...,swap-1,erc20:0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2,swap,1500000000000000000,asset_amt,1700000000000,0xdef456...
```

- Transaction: `0xdef456...`
- From: `0x1234abcd...` ✅
- Swap Event: 1.5 WETH (1.5 × 10^18 = 1500000000000000000) ✅
- Block timestamp: 1700000000 ✅
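This cross-check can also be scripted against your RPC endpoint. A sketch using viem (an assumption, not necessarily what your project uses; the row values are placeholders to replace with real data from your CSV):

```ts
// verify-tx.ts — cross-check one CSV action row against on-chain data.
import { createPublicClient, http } from "viem";
import { mainnet } from "viem/chains";

const client = createPublicClient({ chain: mainnet, transport: http("https://your-rpc-url") });

// Placeholder values copied from the CSV row you want to verify.
const csvRow = { user: "0x1234abcd...", ts_ms: 1700000000000, tx_ref: "0xdef456..." } as const;

const tx = await client.getTransaction({ hash: csvRow.tx_ref });
const block = await client.getBlock({ blockNumber: tx.blockNumber! }); // mined tx, so blockNumber is set

console.log("from matches user:", tx.from.toLowerCase() === csvRow.user.toLowerCase());
console.log("ts_ms matches block timestamp × 1000:", Number(block.timestamp) * 1000 === csvRow.ts_ms);
```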
## Step 5: Validate Edge Cases
### For ERC20/Token Adapters
| Edge Case | What to Check |
|---|---|
| Mints | `from` = `0x0`, positive balance delta |
| Burns | `to` = `0x0`, negative balance delta |
| Self-transfers | Should be skipped or zero-sum |
| Contract interactions | User should be `transactionFrom` |
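A compact way to reason about these cases is a classifier like the sketch below. The zero-address constant and the `from`/`to` names assume a standard ERC-20 `Transfer` decode; this is illustrative, not the adapter API:

```ts
// Classify a Transfer so each edge case is handled explicitly.
const ZERO_ADDRESS = "0x0000000000000000000000000000000000000000";

type TransferKind = "mint" | "burn" | "self-transfer" | "transfer";

function classifyTransfer(from: string, to: string): TransferKind {
  if (from.toLowerCase() === ZERO_ADDRESS) return "mint"; // credit `to` only
  if (to.toLowerCase() === ZERO_ADDRESS) return "burn"; // debit `from` only
  if (from.toLowerCase() === to.toLowerCase()) return "self-transfer"; // skip or zero-sum
  return "transfer"; // debit `from`, credit `to`
}
```

Grepping your CSV for a known mint or burn transaction and confirming it landed in the right branch is usually enough to validate these cases.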
### For DEX Adapters (UniV2/V3)
| Edge Case | What to Check |
|---|---|
| Same-token swaps | Should emit only one side |
| Multi-hop swaps | Each hop is a separate event |
| LP mints/burns | Skip zero address |
| Flash loans | Usually same-block round-trip |
### For Lending Adapters
| Edge Case | What to Check |
|---|---|
| Liquidations | Complex multi-party events |
| Interest accrual | Index changes captured |
| Self-repay | Same user supply/borrow |
## Step 6: Verify Position Windows
For position-based trackables, verify that windows are correct:
### Check Window Continuity
```bash
# Sort by user and window_start
sort -t',' -k1,1 -k7,7n test-output.csv | head -50
```

Each user's windows should:
- Not overlap (previous `window_end` ≤ next `window_start`)
- Cover the full time range
- Have monotonic timestamps
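The same check can be scripted. A sketch, assuming the positions column order above with `window_start` and `window_end` in columns 7 and 8:

```ts
// check-windows.ts — flag overlapping windows per user.
import { readFileSync } from "node:fs";

const rows = readFileSync("test-output.csv", "utf8").trim().split("\n").slice(1)
  .map((line) => line.split(","))
  .map((cols) => ({ user: cols[0], start: Number(cols[6]), end: Number(cols[7]) }));

// Group windows by user.
const byUser = new Map<string, { start: number; end: number }[]>();
for (const row of rows) {
  const windows = byUser.get(row.user) ?? [];
  windows.push({ start: row.start, end: row.end });
  byUser.set(row.user, windows);
}

// Sort each user's windows and flag any overlap.
for (const [user, windows] of byUser) {
  windows.sort((a, b) => a.start - b.start);
  for (let i = 1; i < windows.length; i++) {
    if (windows[i - 1].end > windows[i].start) {
      console.warn(`${user}: windows overlap around ${windows[i].start}`);
    }
  }
}
```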
### Check Balance Consistency
For a single user, sum all balance deltas:
```bash
# Extract user 0x1234's rows and sum quantities
grep "0x1234" test-output.csv | awk -F',' '{sum += $5} END {print sum}'
```

This should match their final balance on-chain.
## Step 7: Common Issues and Fixes
### Issue: No Output Generated
**Symptoms:** Empty CSV, no events logged
**Possible Causes:**

- Wrong contract addresses in config
- Wrong event topics in `buildSqdProcessor`
- Block range has no relevant events
```ts
// Add logging to onLog
onLog: async ({ log }) => {
  console.log('Received log:', log.topic0, log.address);
  // ... rest of handler
}
```

### Issue: Missing Events
**Symptoms:** Some expected transactions not in output
**Possible Causes:**

- Filter too narrow (missing address or topic)
- Instance filtering excludes events
- Handler skips events (validation logic)
```ts
// Log all events before filtering
if (log.topic0 === myTopic) {
  console.log('Before filter:', log);
  const instances = config.myTrackable.filter(...);
  console.log('Matching instances:', instances.length);
}
```

### Issue: Wrong Quantities
**Symptoms:** Numbers don't match on-chain data
**Possible Causes:**

- Decimal scaling issue
- Wrong field decoded
- Signed/unsigned confusion
```ts
// Log raw decoded values
const decoded = abi.events.Transfer.decode(log);
console.log('Raw value:', decoded.value.toString());
console.log('Expected decimals:', 18);
console.log('Scaled:', Number(decoded.value) / 1e18);
```

### Issue: Duplicate Events
**Symptoms:** Same transaction appears multiple times
**Possible Causes:**

- Fan-out to multiple instances (expected)
- Missing deduplication key
- Handler called multiple times
**Fix:** Use deterministic keys for actions:
```ts
key: md5Hash(`${log.txRef}${log.index}`);
```

## Checklist: Before Shipping
- Ran on small block range — Adapter completes successfully
- Inspected CSV output — All fields look reasonable
- Cross-referenced 5+ transactions — Matches on-chain data
- Tested edge cases — Mints, burns, special addresses handled
- Verified window continuity — No gaps or overlaps
- Balance consistency check — Sums match final on-chain state
- Saved expected output — For regression testing
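For the last item, a saved CSV plus a tiny comparison script is usually enough to catch regressions. A sketch (file paths are hypothetical; rows are sorted so ordering differences don't cause false failures):

```ts
// regression-check.ts — compare a fresh run against a saved golden CSV.
import { readFileSync } from "node:fs";
import assert from "node:assert";

const normalize = (csv: string) => csv.trim().split("\n").sort().join("\n");

const expected = normalize(readFileSync("tests/expected/test-output.csv", "utf8"));
const actual = normalize(readFileSync("test-output.csv", "utf8"));

assert.strictEqual(actual, expected, "CSV output drifted from the saved golden file");
console.log("Regression check passed");
```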
## Getting Help
If you encounter issues:
- Check the Adapter Reference for patterns
- Compare with working adapters (ERC20, UniV2)
- Add extensive logging to narrow down the issue
- Ask in the Absinthe Community Slack with:
  - Your config file
  - Error messages
  - Sample of expected vs actual output