flask · sqlalchemy · analytics · dashboard · api-development

Building Better Analytics: Adding Date Range Filtering to a Usage Dashboard

Oli


Last week, we tackled what seemed like a straightforward feature request: add date range filtering to our usage analytics dashboard. Users wanted to view their API usage statistics for specific time periods—7 days, 30 days, 90 days, or all time. Simple enough, right?

Well, as often happens in software development, what started as a "quick enhancement" turned into a deeper dive into database compatibility and API design. Here's the story of how we built it and what we learned along the way.

The Feature: From Frontend to Backend

The requirement was clear: give users the ability to filter their usage statistics by date ranges. This meant updating three main components:

1. API Endpoints with Optional Date Filtering

We enhanced our existing stats endpoints (/v1/stats/usage, /usage/by-bot, /usage/by-model) with an optional days query parameter:

from datetime import datetime, timedelta
from typing import Optional

@router.get("/usage")
def get_usage_stats(days: Optional[int] = None, db: Session = Depends(get_db)):
    query = db.query(UsageEvent)

    # Omitting `days` means "all time": no filter is applied
    if days:
        cutoff_date = datetime.utcnow() - timedelta(days=days)
        query = query.filter(UsageEvent.created_at >= cutoff_date)

    return query.all()

2. Dashboard UI with Date Range Dropdown

On the frontend, we added a clean dropdown interface that lets users select their preferred time range:

<select id="dateRange" onchange="usagePage().load()">
    <option value="7">Last 7 days</option>
    <option value="30" selected>Last 30 days</option>
    <option value="90">Last 90 days</option>
    <option value="">All time</option>
</select>

3. Updated API Client and Postman Collection

We updated our JavaScript API client to pass the date range parameter and created a comprehensive Postman collection with pre-configured requests for different time ranges.

The Challenge: When Databases Don't Play Nice

Everything seemed to work perfectly in development with PostgreSQL. Our manual checks passed, the UI looked great, and the API responses were snappy. Then we ran our test suite, which runs against SQLite.

Boom. TypeError: fromisoformat: argument must be str

The culprit? This innocent-looking SQLAlchemy query for grouping usage by day:

# This worked in PostgreSQL but failed in SQLite
query = db.query(
    cast(UsageEvent.created_at, Date).label('date'),
    func.count().label('count')
).group_by(cast(UsageEvent.created_at, Date))

Lessons Learned: Database Abstraction Isn't Always Abstract

Here's what we discovered: SQLite and PostgreSQL handle date casting very differently.

  • PostgreSQL: CAST(timestamp AS DATE) returns a proper date object
  • SQLite: CAST(timestamp AS DATE) uses "numeric affinity," returning an integer

When SQLAlchemy tried to process SQLite's integer result with its Date result processor, it called fromisoformat() on an integer—hence our error.
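
The affinity quirk can be reproduced with nothing but the standard-library sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "DATE" is not a real SQLite type name, so the cast falls back to
# numeric affinity: the leading digits of the string become an integer
cast_result = conn.execute(
    "SELECT CAST('2024-01-15 10:30:00' AS DATE)"
).fetchone()[0]
print(cast_result, type(cast_result))  # 2024 <class 'int'>
```

SQLAlchemy's Date result processor then hands that integer to date.fromisoformat(), which only accepts strings, producing exactly the TypeError above.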

The Solution: Use Database Functions, Not Casts

Instead of relying on SQL's CAST function, we switched to using the database's native DATE() function:

# This works consistently across databases
query = db.query(
    func.date(UsageEvent.created_at).label('date'),
    func.count().label('count')
).group_by(func.date(UsageEvent.created_at))

Both SQLite and PostgreSQL implement a date() function. SQLite's returns an ISO-8601 date string, which SQLAlchemy's result processor parses correctly, while PostgreSQL's returns a native date value, so SQLAlchemy handles both consistently.
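
To see the fix working end to end, here's a self-contained sketch using a minimal stand-in model (not our actual schema) against an in-memory SQLite database:

```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, create_engine, func
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class UsageEvent(Base):
    __tablename__ = "usage_events"
    id = Column(Integer, primary_key=True)
    created_at = Column(DateTime)

engine = create_engine("sqlite://")  # in-memory database
Base.metadata.create_all(engine)

with Session(engine) as db:
    db.add_all([
        UsageEvent(created_at=datetime(2024, 1, 15, 9, 0)),
        UsageEvent(created_at=datetime(2024, 1, 15, 17, 30)),
        UsageEvent(created_at=datetime(2024, 1, 16, 8, 0)),
    ])
    db.commit()

    # Group events by calendar day via the database's native date()
    rows = (
        db.query(
            func.date(UsageEvent.created_at).label("date"),
            func.count().label("count"),
        )
        .group_by(func.date(UsageEvent.created_at))
        .order_by(func.date(UsageEvent.created_at))
        .all()
    )

print([tuple(r) for r in rows])  # [('2024-01-15', 2), ('2024-01-16', 1)]
```

Swapping the engine URL for a PostgreSQL DSN runs the same query unchanged, which is the whole point of the func.date() approach.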

Testing: The Unsung Hero

Interestingly, this bug had been lurking in our codebase for a while. Our existing tests only covered the empty-state scenarios—what happens when there's no usage data. It wasn't until we added comprehensive tests with actual data that we discovered the cross-database compatibility issue.

def test_days_filter_on_all_endpoints(client, test_usage_events):
    """Test that date filtering works across all stats endpoints with real data"""
    response = client.get("/v1/stats/usage?days=7")
    assert response.status_code == 200
    # ... additional assertions

This reminded us why comprehensive test coverage matters—not just for code paths, but for different data states and environments.

The Final Result

After working through these challenges, we delivered a robust feature that includes:

  • Flexible API endpoints with optional date range filtering
  • Intuitive dashboard UI with a clean dropdown interface
  • Cross-database compatibility that works with both PostgreSQL and SQLite
  • Comprehensive test coverage including edge cases
  • Updated API documentation via Postman collection

All 86 tests pass, and users can now slice and dice their usage analytics by any time period they choose.

Looking Forward

While celebrating this feature completion, we identified a few areas for future improvement:

  1. Manual QA on production to ensure everything works as expected in the live environment
  2. Automated API testing using Newman to validate our Postman collection
  3. Configuration management for model pricing data that's currently duplicated between frontend and backend
  4. Enhanced UX by showing the selected date range in chart titles

Takeaways for Fellow Developers

  1. Test with real data: Empty-state tests are important, but don't forget to test with actual data scenarios
  2. Database functions > SQL casts: When working with multiple database engines, native functions often provide better compatibility than SQL standard features
  3. Start simple, iterate: We could have built a complex date picker, but a simple dropdown met user needs perfectly
  4. Comprehensive testing pays dividends: The extra time spent writing thorough tests saved us from production bugs

Sometimes the most valuable learning comes from features that seem straightforward but push you to understand your tools more deeply. This "simple" date range filter taught us about database compatibility, reinforced good testing practices, and ultimately made our application more robust.

What seemingly simple features have taught you the most? Share your stories in the comments below!