How to Test Your Adapter – Absinthe Docs

How to Test Your Adapter

This guide explains how to test and verify your adapter's correctness using manual CSV verification.

Prerequisites

Before testing, ensure you have:

  • Redis running locally (docker run -d -p 6379:6379 redis)
  • RPC URL for your target chain
  • Block range to test (should start at the block where the contract was deployed)
  • Access to a block explorer (Etherscan, etc.)

Step 1: Create a Test Configuration

Create a minimal config file in adapters/your-adapter/tests/config/test.absinthe.json:

{
  "chainArch": "evm",
  "flushInterval": "1h",
  "redisUrl": "redis://localhost:6379",
 
  "sinkConfig": {
    "sinks": [{ "sinkType": "csv", "path": "test-output.csv" }, { "sinkType": "stdout" }]
  },
 
  "network": {
    "chainId": 1,
    "gatewayUrl": "https://v2.archive.subsquid.io/network/ethereum-mainnet",
    "rpcUrl": "https://your-rpc-url",
    "finality": 75
  },
 
  "range": {
    "fromBlock": 18000000,
    "toBlock": 18001000
  },
 
  "adapterConfig": {
    "adapterId": "your-adapter-name",
    "config": {
      "yourTrackable": [
        {
          "params": {
            "contractAddress": "0x..."
          },
          "pricing": {
            "kind": "pegged",
            "usdPegValue": 1
          }
        }
      ]
    }
  }
}

Testing Best Practices

| Practice | Why |
| --- | --- |
| Small block range | Faster iteration (1,000-5,000 blocks) |
| Use pegged pricing | Eliminates price API variability |
| Include stdout sink | See events in real time |
| CSV output | Easy to analyze and share |

Step 2: Run the Adapter

Terminal
# Clear Redis state (important for clean tests)
redis-cli FLUSHALL
 
# Run the adapter
pnpm run dev -- --config adapters/your-adapter/tests/config/test.absinthe.json

Expected Output

You should see:

  1. Config validation message
  2. Block processing progress
  3. Flush events (every flushInterval)
  4. CSV file creation
[INFO] Config loaded and validated
[INFO] Processing blocks 18000000-18001000...
[INFO] Block 18000100 processed
[INFO] Block 18000200 processed
...
[INFO] Flushing 42 positions to CSV...
[INFO] Complete. Output: test-output.csv

Step 3: Inspect the CSV Output

Open the CSV

Terminal
# View first 20 lines
head -20 test-output.csv
 
# Count total rows
wc -l test-output.csv
 
# View specific columns (user, asset, quantity)
cut -d',' -f1,3,5 test-output.csv | head -20

Expected CSV Format

For Positions

user,trackable_id,asset_key,activity,quantity,quantity_basis,window_start,window_end,...
0x1234...,token-1,erc20:0xabc...,hold,1000.5,asset_amt,1700000000,1700003600,...
0x5678...,token-1,erc20:0xabc...,hold,500.25,asset_amt,1700000000,1700003600,...

For Actions

user,trackable_id,asset_key,activity,quantity,quantity_basis,ts_ms,tx_ref,...
0x1234...,swap-1,erc20:0xabc...,swap,100.5,monetary_value,1700000000000,0xdef...,...

Key Fields to Check

| Field | What to Verify |
| --- | --- |
| user | Valid Ethereum address (0x..., 42 chars) |
| asset_key | Matches expected format (erc20:0x..., custom:prefix) |
| quantity | Non-negative, reasonable magnitude |
| quantity_basis | asset_amt or monetary_value |
| window_start/end | Valid Unix timestamps, end > start |
| tx_ref | Valid transaction hash |
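These checks can be scripted once rows are parsed. A minimal sketch, assuming the field names from the CSV headers shown above (the row shape here is illustrative, not part of the adapter API):

```typescript
// Sanity checks for one parsed position row. Field names mirror the
// CSV headers shown earlier; adjust to your actual output.
interface PositionRow {
  user: string;
  asset_key: string;
  quantity: number;
  quantity_basis: string;
  window_start: number;
  window_end: number;
}

function checkRow(row: PositionRow): string[] {
  const errors: string[] = [];
  if (!/^0x[0-9a-fA-F]{40}$/.test(row.user)) errors.push("user: not a 42-char 0x address");
  if (!/^(erc20:0x[0-9a-fA-F]{40}|custom:.+)$/.test(row.asset_key)) errors.push("asset_key: unexpected format");
  if (!(row.quantity >= 0)) errors.push("quantity: negative or NaN"); // NaN fails the comparison too
  if (!["asset_amt", "monetary_value"].includes(row.quantity_basis)) errors.push("quantity_basis: unknown value");
  if (!(row.window_end > row.window_start)) errors.push("window: end must be > start");
  return errors;
}

// A well-formed row should produce no errors.
const ok = checkRow({
  user: "0x" + "12".repeat(20),
  asset_key: "erc20:0x" + "ab".repeat(20),
  quantity: 1000.5,
  quantity_basis: "asset_amt",
  window_start: 1700000000,
  window_end: 1700003600,
});
console.log(ok.length === 0 ? "row ok" : ok.join("; ")); // row ok
```

Running this over every parsed row gives a quick first pass before the manual cross-referencing in the next step.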

Step 4: Cross-Reference with On-Chain Data

4.1 Pick a Sample Transaction

From your CSV output, pick a specific transaction to verify:

0x1234...,swap-1,erc20:0xweth...,swap,1.5,monetary_value,1700000000000,0xabc123...

4.2 Look Up the Transaction

Go to your block explorer (Etherscan, etc.) and look up the transaction:

https://etherscan.io/tx/0xabc123...

4.3 Verify the Data

Check that:

| CSV Field | On-Chain Data |
| --- | --- |
| user | Transaction `from` address |
| asset_key | Correct token contract |
| quantity | Matches event logs (after decimals) |
| ts_ms | Block timestamp × 1000 |
| tx_ref | Transaction hash |

Example Verification

CSV Row:
0x1234abcd...,swap-1,erc20:0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2,swap,1500000000000000000,asset_amt,1700000000000,0xdef456...
On-Chain (Etherscan):
  • Transaction: 0xdef456...
  • From: 0x1234abcd...
  • Swap Event: 1.5 WETH (1.5 × 10^18 = 1500000000000000000) ✅
  • Block timestamp: 1700000000 ✅
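The scaling and timestamp arithmetic from this example can be reproduced in a few lines:

```typescript
// Raw ERC20 amounts are integers scaled by 10^decimals (18 for WETH).
const rawValue = 1500000000000000000n; // value decoded from the event log
const decimals = 18;
const human = Number(rawValue) / 10 ** decimals;
console.log(human); // 1.5 (WETH)

// ts_ms in the CSV is the block timestamp (seconds) times 1000.
const blockTimestamp = 1700000000;
const tsMs = blockTimestamp * 1000;
console.log(tsMs); // 1700000000000
```

Note the `Number(rawValue)` conversion is only safe for display-level checks; values above 2^53 lose precision, so keep BigInt arithmetic inside the adapter itself.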

Step 5: Validate Edge Cases

For ERC20/Token Adapters

| Edge Case | What to Check |
| --- | --- |
| Mints | from = 0x0, positive balance delta |
| Burns | to = 0x0, negative balance delta |
| Self-transfers | Should be skipped or zero-sum |
| Contract interactions | User should be transactionFrom |
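A sketch of how a handler might classify these cases from a decoded Transfer event (the function and type names here are illustrative, not the adapter API):

```typescript
// Classify a decoded Transfer event into the edge cases listed above.
const ZERO_ADDRESS = "0x0000000000000000000000000000000000000000";

type TransferKind = "mint" | "burn" | "self-transfer" | "transfer";

function classifyTransfer(from: string, to: string): TransferKind {
  const f = from.toLowerCase();
  const t = to.toLowerCase();
  if (f === ZERO_ADDRESS) return "mint";   // positive balance delta for `to`
  if (t === ZERO_ADDRESS) return "burn";   // negative balance delta for `from`
  if (f === t) return "self-transfer";     // zero-sum: skip or emit nothing
  return "transfer";
}

const alice = "0x" + "a1".repeat(20);
console.log(classifyTransfer(ZERO_ADDRESS, alice)); // mint
```

When verifying, grep your CSV for the zero address explicitly: a mint that shows up as a regular transfer usually means the handler forgot this branch.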

For DEX Adapters (UniV2/V3)

| Edge Case | What to Check |
| --- | --- |
| Same-token swaps | Should emit only one side |
| Multi-hop swaps | Each hop is a separate event |
| LP mints/burns | Skip the zero address |
| Flash loans | Usually a same-block round-trip |

For Lending Adapters

| Edge Case | What to Check |
| --- | --- |
| Liquidations | Complex multi-party events |
| Interest accrual | Index changes captured |
| Self-repay | Same user supply/borrow |

Step 6: Verify Position Windows

For position-based trackables, verify that windows are correct:

Check Window Continuity

Terminal
# Sort by user and window_start
sort -t',' -k1,1 -k7,7n test-output.csv | head -50

Each user's windows should:

  • Not overlap (previous window_end ≤ next window_start)
  • Cover the full time range
  • Have monotonic timestamps
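These continuity rules can be checked programmatically. A minimal sketch over parsed rows for a single user:

```typescript
// Verify one user's position windows: each window well-formed
// (end > start) and no overlap between consecutive windows.
interface Window {
  window_start: number;
  window_end: number;
}

function checkContinuity(windows: Window[]): boolean {
  // Sort by window_start, as with the `sort` command above.
  const sorted = [...windows].sort((a, b) => a.window_start - b.window_start);
  return sorted.every((w, i) => {
    if (w.window_end <= w.window_start) return false; // degenerate window
    if (i === 0) return true;
    return sorted[i - 1].window_end <= w.window_start; // no overlap
  });
}

const good = checkContinuity([
  { window_start: 1700000000, window_end: 1700003600 },
  { window_start: 1700003600, window_end: 1700007200 },
]);
console.log(good); // true
```

Run it per user (group rows by the `user` column first); a single overlapping pair is enough to flag a state-handling bug in the adapter.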

Check Balance Consistency

For a single user, sum all balance deltas:

Terminal
# Extract user 0x1234's rows and sum quantities
grep "0x1234" test-output.csv | awk -F',' '{sum += $5} END {print sum}'

This should match their final balance on-chain.
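The same sum can be computed over parsed rows, which is handier than awk once you want to compare several users at once (a sketch; the row shape is illustrative):

```typescript
// Sum one user's balance deltas from parsed CSV rows. The result
// should match the user's final on-chain balance (fetched separately,
// e.g. via an RPC balanceOf call).
function sumDeltas(rows: { user: string; quantity: number }[], user: string): number {
  return rows
    .filter((r) => r.user.toLowerCase() === user.toLowerCase())
    .reduce((acc, r) => acc + r.quantity, 0);
}

const rows = [
  { user: "0x1234", quantity: 100.5 },
  { user: "0x1234", quantity: -40.5 },
  { user: "0x5678", quantity: 7 },
];
console.log(sumDeltas(rows, "0x1234")); // 60
```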

Step 7: Common Issues and Fixes

Issue: No Output Generated

Symptoms: Empty CSV, no events logged

Possible Causes:
  1. Wrong contract addresses in config
  2. Wrong event topics in buildSqdProcessor
  3. Block range has no relevant events
Debug Steps:
// Add logging to onLog
onLog: async ({ log }) => {
  console.log('Received log:', log.topic0, log.address);
  // ... rest of handler
}

Issue: Missing Events

Symptoms: Some expected transactions not in output

Possible Causes:
  1. Filter too narrow (missing address or topic)
  2. Instance filtering excludes events
  3. Handler skips events (validation logic)
Debug Steps:
// Log all events before filtering
if (log.topic0 === myTopic) {
  console.log('Before filter:', log);
  const instances = config.myTrackable.filter(...);
  console.log('Matching instances:', instances.length);
}

Issue: Wrong Quantities

Symptoms: Numbers don't match on-chain data

Possible Causes:
  1. Decimal scaling issue
  2. Wrong field decoded
  3. Signed/unsigned confusion
Debug Steps:
// Log raw decoded values
const decoded = abi.events.Transfer.decode(log);
console.log('Raw value:', decoded.value.toString());
console.log('Expected decimals:', 18);
console.log('Scaled:', Number(decoded.value) / 1e18);

Issue: Duplicate Events

Symptoms: Same transaction appears multiple times

Possible Causes:
  1. Fan-out to multiple instances (expected)
  2. Missing deduplication key
  3. Handler called multiple times

Fix: Use deterministic keys for actions:

key: md5Hash(`${log.txRef}${log.index}`);
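Assuming `md5Hash` wraps a standard MD5 digest, an equivalent can be sketched with Node's built-in crypto module to show why the key is deterministic:

```typescript
import { createHash } from "node:crypto";

// The same (txRef, logIndex) pair always yields the same key, so
// re-processing a block cannot create a second copy of the action.
function dedupKey(txRef: string, logIndex: number): string {
  return createHash("md5").update(`${txRef}${logIndex}`).digest("hex");
}

const a = dedupKey("0xdef456", 3);
const b = dedupKey("0xdef456", 3);
console.log(a === b); // true — deterministic
```

Including the log index (not just the transaction hash) matters: one transaction can emit several relevant events, and each needs its own key.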

Checklist: Before Shipping

  • Ran on small block range — Adapter completes successfully
  • Inspected CSV output — All fields look reasonable
  • Cross-referenced 5+ transactions — Matches on-chain data
  • Tested edge cases — Mints, burns, special addresses handled
  • Verified window continuity — No gaps or overlaps
  • Balance consistency check — Sums match final on-chain state
  • Saved expected output — For regression testing

Getting Help

If you encounter issues:

  1. Check the Adapter Reference for patterns
  2. Compare with working adapters (ERC20, UniV2)
  3. Add extensive logging to narrow down the issue
  4. Ask in the Absinthe Community Slack with:
    • Your config file
    • Error messages
    • Sample of expected vs actual output