Implement comprehensive transaction fraud monitoring API

- Created FastAPI application with transaction ingestion endpoints
- Built dynamic rule engine supporting velocity checks and aggregations
- Implemented real-time and batch screening capabilities
- Added rule management with versioning and rollback functionality
- Created comprehensive audit and reporting endpoints with pagination
- Set up SQLite database with proper migrations using Alembic
- Added intelligent caching for aggregate computations
- Included extensive API documentation and example rule definitions
- Configured CORS, health endpoints, and proper error handling
- Added support for time-windowed aggregations (sum, count, avg, max, min)
- Built background processing for high-volume batch screening
- Implemented field-agnostic rule conditions with flexible operators

Features include transaction ingestion, rule CRUD operations, real-time screening, batch processing, aggregation computations, and comprehensive reporting capabilities suitable for fintech fraud monitoring systems.
Parent: e2a883beb6
Commit: 4730c37915

README.md (371 lines changed; the new content replaces the default BackendIM FastAPI scaffold README)
# Transaction Fraud Monitoring API

A comprehensive API-driven transaction monitoring system designed to screen transactions against dynamic fraud rules in real time. The system supports velocity checks, aggregation computations, and rule management, and provides both real-time and batch screening capabilities.
## Features

- **Transaction Ingestion**: Bulk and individual transaction ingestion
- **Dynamic Rule Engine**: Create and manage fraud detection rules with aggregation support
- **Real-time Screening**: Instant transaction screening against active rules
- **Batch Processing**: Background batch screening for historical data analysis
- **Aggregation Support**: Velocity checks, sum, count, average computations over time windows
- **Rule Versioning**: Complete audit trail with rule rollback capabilities
- **Caching**: Intelligent caching for aggregate computations to improve performance
- **Comprehensive Reporting**: Detailed screening results with pagination and filtering
## Architecture

### Core Components

1. **Rule Engine**: Evaluates transactions against configurable rules (a minimal condition-evaluation sketch follows this list) with support for:
   - Field comparisons (eq, ne, gt, gte, lt, lte, in, contains)
   - Aggregate functions (sum, count, avg, max, min)
   - Time-based windowing (24h, 7d, 30d)
   - Group-by operations (user_id, account_id, device_id)
2. **Caching Layer**: Optimizes aggregate computations with intelligent cache expiration
3. **Background Processing**: Asynchronous batch screening for high-volume processing
4. **Audit Trail**: Complete history of rule changes and screening results
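The operator set above maps naturally onto a small dispatch table. The following is a minimal sketch of how a single non-aggregate condition could be evaluated against a transaction record; it is illustrative only, not the project's actual `RuleEngine` code, and the `Condition` shape simply mirrors the JSON rule format documented later in this README.

```python
from dataclasses import dataclass
from typing import Any

# Dispatch table for the comparison operators supported by rule conditions.
OPERATORS = {
    "eq": lambda field, value: field == value,
    "ne": lambda field, value: field != value,
    "gt": lambda field, value: field > value,
    "gte": lambda field, value: field >= value,
    "lt": lambda field, value: field < value,
    "lte": lambda field, value: field <= value,
    "in": lambda field, value: field in value,
    # 'contains' is interpreted here as membership/substring on the field;
    # the real engine may define it differently.
    "contains": lambda field, value: value in field,
}

@dataclass
class Condition:
    field: str
    operator: str
    value: Any

def evaluate_condition(condition: Condition, transaction: dict) -> bool:
    """Return True when the transaction field satisfies the condition."""
    field_value = transaction.get(condition.field)
    if field_value is None:
        return False
    return OPERATORS[condition.operator](field_value, condition.value)

# Example: a single transaction larger than 500,000 satisfies a "gt" condition.
print(evaluate_condition(Condition("amount", "gt", 500_000), {"amount": 750_000}))  # True
```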
## Installation

1. Install dependencies:
```bash
pip install -r requirements.txt
```

2. Run database migrations:
```bash
alembic upgrade head
```

3. Start the application:
```bash
uvicorn main:app --host 0.0.0.0 --port 8000
```
## Environment Variables

The following environment variables can be configured (a small sketch of reading them follows the list):

- `SECRET_KEY`: Secret key for API security (default: "your-secret-key-here")
- `DATABASE_URL`: Database connection URL (optional, uses SQLite by default)
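A purely illustrative sketch of picking these up in Python; the application's real settings handling may differ, and the SQLite default shown matches the path documented in the Database section below.

```python
import os

# Fall back to the documented defaults when the variables are unset.
SECRET_KEY = os.environ.get("SECRET_KEY", "your-secret-key-here")
DATABASE_URL = os.environ.get("DATABASE_URL", "sqlite:////app/storage/db/db.sqlite")

print(f"Using database: {DATABASE_URL}")
```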
## API Endpoints

### Transaction Management

#### POST /api/v1/transactions/
Ingest a single transaction.

**Request Body:**
```json
{
  "transaction_id": "txn_12345",
  "user_id": "user_123",
  "account_id": "acc_456",
  "amount": 50000.00,
  "currency": "NGN",
  "transaction_type": "debit",
  "merchant_id": "merchant_789",
  "merchant_category": "grocery",
  "channel": "mobile",
  "location": "Lagos, Nigeria",
  "ip_address": "192.168.1.1",
  "device_id": "device_abc",
  "status": "pending",
  "metadata": {
    "app_version": "2.1.0",
    "user_agent": "Mobile App"
  }
}
```

#### POST /api/v1/transactions/bulk
Ingest multiple transactions in bulk.

#### GET /api/v1/transactions/
Retrieve transactions with filtering and pagination (a client sketch follows the parameter list).

**Query Parameters:**
- `page`: Page number (default: 1)
- `page_size`: Items per page (default: 100, max: 1000)
- `user_id`: Filter by user ID
- `account_id`: Filter by account ID
- `transaction_type`: Filter by transaction type
- `channel`: Filter by channel
- `status`: Filter by status
- `min_amount`: Minimum amount filter
- `max_amount`: Maximum amount filter
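A quick way to exercise these endpoints is a short client script. The snippet below is illustrative only: it assumes the server is running locally on port 8000 and uses the third-party `requests` package, which is not necessarily part of this project's requirements.

```python
import requests

BASE = "http://localhost:8000/api/v1"

# Ingest one transaction (payload mirrors the request body shown above).
txn = {
    "transaction_id": "txn_12345",
    "user_id": "user_123",
    "account_id": "acc_456",
    "amount": 50000.00,
    "currency": "NGN",
    "transaction_type": "debit",
    "channel": "mobile",
}
print(requests.post(f"{BASE}/transactions/", json=txn).status_code)

# List that user's transactions, 20 per page.
resp = requests.get(f"{BASE}/transactions/", params={"user_id": "user_123", "page": 1, "page_size": 20})
print(resp.json()["total"])
```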
### Rule Management

#### POST /api/v1/rules/
Create a new fraud detection rule.

**Example Rule for Velocity Check:**
```json
{
  "name": "High Velocity Transaction Check",
  "description": "Flag if user has more than 10 transactions > ₦100,000 in 24 hours",
  "rule_type": "velocity",
  "conditions": [
    {
      "field": "amount",
      "operator": "gt",
      "value": 100000,
      "aggregate_function": "count",
      "time_window": "24h",
      "group_by": ["user_id"]
    }
  ],
  "actions": [
    {
      "action_type": "flag",
      "parameters": {
        "risk_score": 80,
        "reason": "High velocity detected"
      }
    }
  ],
  "priority": 1,
  "is_active": true
}
```

**Example Rule for Amount Limit:**
```json
{
  "name": "Large Single Transaction",
  "description": "Flag transactions over ₦500,000",
  "rule_type": "amount_limit",
  "conditions": [
    {
      "field": "amount",
      "operator": "gt",
      "value": 500000
    }
  ],
  "actions": [
    {
      "action_type": "flag",
      "parameters": {
        "risk_score": 90,
        "reason": "Large transaction amount"
      }
    }
  ],
  "priority": 2,
  "is_active": true
}
```

**Example Rule for Sum-based Velocity:**
```json
{
  "name": "Daily Spending Limit",
  "description": "Flag if sum of transactions from same user > ₦500,000 within 24 hours",
  "rule_type": "velocity",
  "conditions": [
    {
      "field": "amount",
      "operator": "gt",
      "value": 500000,
      "aggregate_function": "sum",
      "time_window": "24h",
      "group_by": ["user_id"]
    }
  ],
  "actions": [
    {
      "action_type": "flag",
      "parameters": {
        "risk_score": 85,
        "reason": "Daily spending limit exceeded"
      }
    }
  ],
  "priority": 1,
  "is_active": true
}
```

#### GET /api/v1/rules/
Retrieve all rules with filtering and pagination.

#### PUT /api/v1/rules/{rule_id}
Update an existing rule (creates a new version).

#### POST /api/v1/rules/{rule_id}/rollback/{version}
Roll back a rule to a previous version.
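Rule creation, updates, and rollbacks can be scripted the same way. The sketch below, again assuming a local server and the `requests` package, creates the amount-limit rule shown above, updates its priority, and then rolls it back to version 1.

```python
import requests

BASE = "http://localhost:8000/api/v1/rules"

rule = {
    "name": "Large Single Transaction",
    "description": "Flag transactions over ₦500,000",
    "rule_type": "amount_limit",
    "conditions": [{"field": "amount", "operator": "gt", "value": 500000}],
    "actions": [{"action_type": "flag",
                 "parameters": {"risk_score": 90, "reason": "Large transaction amount"}}],
    "priority": 2,
    "is_active": True,
}
created = requests.post(f"{BASE}/", json=rule).json()
rule_id = created["id"]

# Updating creates a new version; rolling back restores version 1's definition.
requests.put(f"{BASE}/{rule_id}", json={"priority": 5})
requests.post(f"{BASE}/{rule_id}/rollback/1")
print(requests.get(f"{BASE}/{rule_id}").json()["version"])
```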
### Screening

#### POST /api/v1/screening/transactions/{transaction_id}
Screen a single transaction in real time.

**Response:**
```json
{
  "transaction_id": "txn_12345",
  "results": [
    {
      "id": 1,
      "transaction_id": "txn_12345",
      "rule_id": 1,
      "rule_name": "High Velocity Transaction Check",
      "rule_version": 1,
      "status": "flagged",
      "risk_score": 80.0,
      "details": {
        "rule_triggered": true,
        "conditions_met": 1,
        "evaluation_time_ms": 45.2,
        "actions": [...]
      },
      "aggregated_data": {
        "value": 12,
        "function": "count",
        "field": "amount",
        "time_window": "24h",
        "group_by": ["user_id"]
      },
      "screening_type": "real_time",
      "created_at": "2024-01-01T12:00:00Z"
    }
  ],
  "overall_status": "flagged",
  "total_risk_score": 80.0,
  "screening_duration_ms": 67.8
}
```

#### POST /api/v1/screening/batch
Create a batch screening job.

#### GET /api/v1/screening/batch
Retrieve batch screening jobs.

#### GET /api/v1/screening/results
Retrieve screening results with audit capabilities.
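In practice, real-time screening is typically called right after ingestion. A minimal, illustrative client flow, assuming a local server and the `requests` package:

```python
import requests

BASE = "http://localhost:8000/api/v1"

# Assumes txn_12345 was ingested earlier.
result = requests.post(f"{BASE}/screening/transactions/txn_12345").json()
if result["overall_status"] == "flagged":
    for hit in result["results"]:
        # Each entry names the rule that fired and the risk score it assigned.
        print(hit["rule_name"], hit["risk_score"])
```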
### Aggregation

#### POST /api/v1/screening/aggregate
Compute aggregate values with caching.

**Request Example:**
```json
{
  "aggregate_function": "sum",
  "field": "amount",
  "group_by": ["user_id"],
  "filters": {
    "transaction_type": "debit"
  },
  "time_window": "24h"
}
```
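The same request can be issued from a script; the response reports the computed value, whether it was served from cache, and how long the computation took. The sketch assumes a local server and the `requests` package.

```python
import requests

payload = {
    "aggregate_function": "sum",
    "field": "amount",
    "group_by": ["user_id"],
    "filters": {"transaction_type": "debit"},
    "time_window": "24h",
}
resp = requests.post("http://localhost:8000/api/v1/screening/aggregate", json=payload).json()
print(resp["result"], resp["cache_hit"], resp["computation_time_ms"])
```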
## Rule Definition Format

Rules are defined using a flexible JSON structure:

### Condition Fields
- `field`: Transaction field to evaluate
- `operator`: Comparison operator (eq, ne, gt, gte, lt, lte, in, not_in, contains, starts_with, ends_with)
- `value`: Value to compare against
- `aggregate_function`: Optional aggregation (sum, count, avg, max, min)
- `time_window`: Time window for aggregation (1h, 24h, 7d, 30d)
- `group_by`: Fields to group by for aggregation

### Action Types
- `flag`: Mark transaction as suspicious
- `block`: Block transaction (highest risk)
- `alert`: Send alert notification
- `score`: Assign custom risk score

### Time Windows
Window strings are an integer followed by a unit suffix (a parsing sketch follows this list):
- `1h`, `2h`, `24h`: Hours
- `1d`, `7d`, `30d`: Days
- `15m`, `30m`: Minutes
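A minimal parser for these tokens, shown as an illustration of the format rather than the service's actual `_parse_time_window` helper, could look like this:

```python
from datetime import timedelta

_UNITS = {"m": "minutes", "h": "hours", "d": "days"}

def parse_time_window(window: str) -> timedelta:
    """Turn strings such as '15m', '24h' or '7d' into a timedelta."""
    value, unit = int(window[:-1]), window[-1]
    if unit not in _UNITS:
        raise ValueError(f"Unsupported time window: {window}")
    return timedelta(**{_UNITS[unit]: value})

assert parse_time_window("24h") == timedelta(hours=24)
assert parse_time_window("7d") == timedelta(days=7)
```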
## Background Jobs

For high-volume processing, use batch screening:

1. Create batch job with filters
2. Monitor job progress via batch endpoints
3. Retrieve results when completed

Batch jobs run asynchronously to avoid blocking the API; a client-side polling sketch is shown below.
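A batch job is created once and then polled until its status reaches `completed` (or `failed`). The following rough sketch assumes a local server and the `requests` package; the filter fields mirror the batch request used by the Screening endpoints.

```python
import time
import requests

BASE = "http://localhost:8000/api/v1/screening/batch"

# Create the job; only debit transactions are screened in this example.
job = requests.post(BASE, json={
    "name": "Backfill screening",
    "transaction_filters": {"transaction_type": "debit"},
}).json()

status = job
for _ in range(60):  # poll for up to ~5 minutes
    status = requests.get(f"{BASE}/{job['batch_id']}").json()
    if status["status"] in ("completed", "failed"):
        break
    time.sleep(5)

print(status["processed_transactions"], status["flagged_transactions"])
```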
## Performance Considerations

- **Caching**: Aggregate computations are cached based on time windows (see the cache-key sketch after this list)
- **Indexing**: Database indexes on key fields (user_id, account_id, device_id, created_at)
- **Background Processing**: Batch jobs prevent API blocking
- **Pagination**: All list endpoints support pagination
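Cached aggregates are stored in the `aggregate_cache` table as a key, a serialized value, and an expiry timestamp. The sketch below shows one plausible way to derive a deterministic cache key and a TTL; the exact key format and expiration policy used by the service are not specified here, so treat this as an assumption.

```python
import hashlib
import json
from datetime import datetime, timedelta, timezone

def aggregate_cache_key(function: str, field: str, group_by: list[str],
                        filters: dict, window: str) -> str:
    """Deterministic key for one aggregate computation (hypothetical scheme)."""
    raw = json.dumps({"fn": function, "field": field, "group_by": group_by,
                      "filters": filters, "window": window}, sort_keys=True)
    return hashlib.sha256(raw.encode()).hexdigest()

key = aggregate_cache_key("sum", "amount", ["user_id"],
                          {"transaction_type": "debit"}, "24h")
# A short TTL relative to the 24h window keeps results fresh without recomputing constantly.
expires_at = datetime.now(timezone.utc) + timedelta(minutes=5)
print(key[:16], expires_at.isoformat())
```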
## Security

- CORS enabled for all origins (restrict before deploying to production)
- Request validation using Pydantic schemas
- SQLAlchemy ORM prevents SQL injection
- Environment variable configuration for secrets

## Development

Run the development server:
```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```

Run linting:
```bash
ruff check . --fix
```

## API Documentation

- Interactive API docs: http://localhost:8000/docs
- ReDoc documentation: http://localhost:8000/redoc
- OpenAPI JSON: http://localhost:8000/openapi.json

## Health Check

- Health endpoint: http://localhost:8000/health
- Base info: http://localhost:8000/

## Database

The system uses SQLite by default, with the database file stored at `/app/storage/db/db.sqlite`. The database includes:

- **transactions**: Core transaction data
- **rules**: Fraud detection rules with versioning
- **rule_versions**: Complete audit trail of rule changes
- **screening_results**: Individual screening outcomes
- **screening_batches**: Batch job management
- **aggregate_cache**: Performance optimization cache

## Example Workflows

### 1. Basic Setup
1. Create rules using POST /api/v1/rules/
2. Ingest transactions using POST /api/v1/transactions/
3. Screen transactions using POST /api/v1/screening/transactions/{id}

### 2. Batch Processing
1. Ingest historical transactions
2. Create batch screening job
3. Monitor batch progress
4. Retrieve and analyze results

### 3. Rule Management
1. Create initial rules
2. Monitor screening results
3. Update rules based on performance
4. Use versioning for audit compliance
alembic.ini (new file, 41 lines)

[alembic]
script_location = alembic
prepend_sys_path = .
version_path_separator = os
sqlalchemy.url = sqlite:////app/storage/db/db.sqlite

[post_write_hooks]

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
alembic/env.py (new file, 51 lines)

from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context
import sys
import os

sys.path.append(os.path.dirname(os.path.dirname(__file__)))

from app.db.base import Base

config = context.config

if config.config_file_name is not None:
    fileConfig(config.config_file_name)

target_metadata = Base.metadata


def run_migrations_offline() -> None:
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
alembic/script.py.mako (new file, 24 lines)

"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
alembic/versions/001_initial_migration.py (new file, 141 lines)

"""Initial migration

Revision ID: 001
Revises:
Create Date: 2024-01-01 12:00:00.000000

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '001'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Create transactions table
    op.create_table('transactions',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('transaction_id', sa.String(), nullable=False),
        sa.Column('user_id', sa.String(), nullable=False),
        sa.Column('account_id', sa.String(), nullable=False),
        sa.Column('amount', sa.Float(), nullable=False),
        sa.Column('currency', sa.String(), nullable=True),
        sa.Column('transaction_type', sa.String(), nullable=False),
        sa.Column('merchant_id', sa.String(), nullable=True),
        sa.Column('merchant_category', sa.String(), nullable=True),
        sa.Column('channel', sa.String(), nullable=False),
        sa.Column('location', sa.String(), nullable=True),
        sa.Column('ip_address', sa.String(), nullable=True),
        sa.Column('device_id', sa.String(), nullable=True),
        sa.Column('status', sa.String(), nullable=True),
        sa.Column('metadata', sa.Text(), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_transactions_id'), 'transactions', ['id'], unique=False)
    op.create_index(op.f('ix_transactions_transaction_id'), 'transactions', ['transaction_id'], unique=True)
    op.create_index(op.f('ix_transactions_user_id'), 'transactions', ['user_id'], unique=False)
    op.create_index(op.f('ix_transactions_account_id'), 'transactions', ['account_id'], unique=False)
    op.create_index(op.f('ix_transactions_merchant_id'), 'transactions', ['merchant_id'], unique=False)
    op.create_index(op.f('ix_transactions_device_id'), 'transactions', ['device_id'], unique=False)

    # Create rules table
    op.create_table('rules',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('name', sa.String(), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('rule_type', sa.String(), nullable=False),
        sa.Column('conditions', sa.Text(), nullable=False),
        sa.Column('actions', sa.Text(), nullable=False),
        sa.Column('priority', sa.Integer(), nullable=True),
        sa.Column('is_active', sa.Boolean(), nullable=True),
        sa.Column('version', sa.Integer(), nullable=True),
        sa.Column('created_by', sa.String(), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_rules_id'), 'rules', ['id'], unique=False)
    op.create_index(op.f('ix_rules_name'), 'rules', ['name'], unique=True)

    # Create rule_versions table
    op.create_table('rule_versions',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('rule_id', sa.Integer(), nullable=False),
        sa.Column('version', sa.Integer(), nullable=False),
        sa.Column('name', sa.String(), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('rule_type', sa.String(), nullable=False),
        sa.Column('conditions', sa.Text(), nullable=False),
        sa.Column('actions', sa.Text(), nullable=False),
        sa.Column('priority', sa.Integer(), nullable=True),
        sa.Column('created_by', sa.String(), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_rule_versions_id'), 'rule_versions', ['id'], unique=False)
    op.create_index(op.f('ix_rule_versions_rule_id'), 'rule_versions', ['rule_id'], unique=False)

    # Create screening_results table
    op.create_table('screening_results',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('transaction_id', sa.String(), nullable=False),
        sa.Column('rule_id', sa.Integer(), nullable=False),
        sa.Column('rule_name', sa.String(), nullable=False),
        sa.Column('rule_version', sa.Integer(), nullable=True),
        sa.Column('status', sa.String(), nullable=False),
        sa.Column('risk_score', sa.Float(), nullable=True),
        sa.Column('details', sa.Text(), nullable=True),
        sa.Column('aggregated_data', sa.Text(), nullable=True),
        sa.Column('screening_type', sa.String(), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_screening_results_id'), 'screening_results', ['id'], unique=False)
    op.create_index(op.f('ix_screening_results_transaction_id'), 'screening_results', ['transaction_id'], unique=False)
    op.create_index(op.f('ix_screening_results_rule_id'), 'screening_results', ['rule_id'], unique=False)

    # Create screening_batches table
    op.create_table('screening_batches',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('batch_id', sa.String(), nullable=False),
        sa.Column('name', sa.String(), nullable=True),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('status', sa.String(), nullable=True),
        sa.Column('total_transactions', sa.Integer(), nullable=True),
        sa.Column('processed_transactions', sa.Integer(), nullable=True),
        sa.Column('flagged_transactions', sa.Integer(), nullable=True),
        sa.Column('rules_applied', sa.Text(), nullable=True),
        sa.Column('started_at', sa.DateTime(timezone=True), nullable=True),
        sa.Column('completed_at', sa.DateTime(timezone=True), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_screening_batches_id'), 'screening_batches', ['id'], unique=False)
    op.create_index(op.f('ix_screening_batches_batch_id'), 'screening_batches', ['batch_id'], unique=True)

    # Create aggregate_cache table
    op.create_table('aggregate_cache',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('cache_key', sa.String(), nullable=False),
        sa.Column('cache_value', sa.Text(), nullable=False),
        sa.Column('expires_at', sa.DateTime(timezone=True), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_aggregate_cache_id'), 'aggregate_cache', ['id'], unique=False)
    op.create_index(op.f('ix_aggregate_cache_cache_key'), 'aggregate_cache', ['cache_key'], unique=True)


def downgrade() -> None:
    op.drop_table('aggregate_cache')
    op.drop_table('screening_batches')
    op.drop_table('screening_results')
    op.drop_table('rule_versions')
    op.drop_table('rules')
    op.drop_table('transactions')
app/__init__.py (new empty file)

app/api/__init__.py (new empty file)
app/api/routes.py (new file, 22 lines)

from fastapi import APIRouter
from app.api import transactions, rules, screening

router = APIRouter()

router.include_router(
    transactions.router,
    prefix="/transactions",
    tags=["transactions"]
)

router.include_router(
    rules.router,
    prefix="/rules",
    tags=["rules"]
)

router.include_router(
    screening.router,
    prefix="/screening",
    tags=["screening"]
)
app/api/rules.py (new file, 322 lines)
|
|||||||
|
from typing import Optional
|
||||||
|
from fastapi import APIRouter, Depends, HTTPException, Query
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from sqlalchemy import desc
|
||||||
|
import json
|
||||||
|
|
||||||
|
from app.db.session import get_db
|
||||||
|
from app.models.rule import Rule, RuleVersion
|
||||||
|
from app.core.schemas import RuleCreate, RuleUpdate, Rule as RuleSchema, PaginatedResponse
|
||||||
|
|
||||||
|
router = APIRouter()
|
||||||
|
|
||||||
|
@router.post("/", response_model=RuleSchema)
|
||||||
|
async def create_rule(
|
||||||
|
rule: RuleCreate,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Create a new fraud detection rule.
|
||||||
|
|
||||||
|
Rules define conditions and actions for transaction screening.
|
||||||
|
Each rule can include aggregation conditions for velocity checks.
|
||||||
|
|
||||||
|
Example rule for velocity check:
|
||||||
|
{
|
||||||
|
"name": "High Velocity Transaction Check",
|
||||||
|
"description": "Flag if user has more than 10 transactions > ₦100,000 in 24 hours",
|
||||||
|
"rule_type": "velocity",
|
||||||
|
"conditions": [
|
||||||
|
{
|
||||||
|
"field": "amount",
|
||||||
|
"operator": "gt",
|
||||||
|
"value": 100000,
|
||||||
|
"aggregate_function": "count",
|
||||||
|
"time_window": "24h",
|
||||||
|
"group_by": ["user_id"]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"actions": [
|
||||||
|
{
|
||||||
|
"action_type": "flag",
|
||||||
|
"parameters": {"risk_score": 80, "reason": "High velocity detected"}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"priority": 1
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
# Check if rule name already exists
|
||||||
|
existing = db.query(Rule).filter(Rule.name == rule.name).first()
|
||||||
|
if existing:
|
||||||
|
raise HTTPException(status_code=400, detail="Rule with this name already exists")
|
||||||
|
|
||||||
|
db_rule = Rule(
|
||||||
|
name=rule.name,
|
||||||
|
description=rule.description,
|
||||||
|
rule_type=rule.rule_type,
|
||||||
|
conditions=json.dumps([condition.dict() for condition in rule.conditions]),
|
||||||
|
actions=json.dumps([action.dict() for action in rule.actions]),
|
||||||
|
priority=rule.priority,
|
||||||
|
is_active=rule.is_active,
|
||||||
|
version=1,
|
||||||
|
created_by=rule.created_by
|
||||||
|
)
|
||||||
|
|
||||||
|
db.add(db_rule)
|
||||||
|
db.commit()
|
||||||
|
db.refresh(db_rule)
|
||||||
|
|
||||||
|
# Also save to rule versions for audit trail
|
||||||
|
rule_version = RuleVersion(
|
||||||
|
rule_id=db_rule.id,
|
||||||
|
version=1,
|
||||||
|
name=db_rule.name,
|
||||||
|
description=db_rule.description,
|
||||||
|
rule_type=db_rule.rule_type,
|
||||||
|
conditions=db_rule.conditions,
|
||||||
|
actions=db_rule.actions,
|
||||||
|
priority=db_rule.priority,
|
||||||
|
created_by=db_rule.created_by
|
||||||
|
)
|
||||||
|
db.add(rule_version)
|
||||||
|
db.commit()
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
result = RuleSchema.from_orm(db_rule)
|
||||||
|
result.conditions = json.loads(db_rule.conditions)
|
||||||
|
result.actions = json.loads(db_rule.actions)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
@router.get("/", response_model=PaginatedResponse)
|
||||||
|
async def get_rules(
|
||||||
|
page: int = Query(1, ge=1),
|
||||||
|
page_size: int = Query(100, ge=1, le=1000),
|
||||||
|
rule_type: Optional[str] = None,
|
||||||
|
is_active: Optional[bool] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Retrieve all fraud detection rules with filtering and pagination.
|
||||||
|
"""
|
||||||
|
query = db.query(Rule)
|
||||||
|
|
||||||
|
# Apply filters
|
||||||
|
if rule_type:
|
||||||
|
query = query.filter(Rule.rule_type == rule_type)
|
||||||
|
if is_active is not None:
|
||||||
|
query = query.filter(Rule.is_active == is_active)
|
||||||
|
|
||||||
|
# Get total count
|
||||||
|
total = query.count()
|
||||||
|
|
||||||
|
# Apply pagination
|
||||||
|
offset = (page - 1) * page_size
|
||||||
|
rules = query.order_by(desc(Rule.priority), desc(Rule.created_at)).offset(offset).limit(page_size).all()
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
items = []
|
||||||
|
for rule in rules:
|
||||||
|
result = RuleSchema.from_orm(rule)
|
||||||
|
result.conditions = json.loads(rule.conditions)
|
||||||
|
result.actions = json.loads(rule.actions)
|
||||||
|
items.append(result)
|
||||||
|
|
||||||
|
return PaginatedResponse(
|
||||||
|
items=items,
|
||||||
|
total=total,
|
||||||
|
page=page,
|
||||||
|
page_size=page_size,
|
||||||
|
total_pages=(total + page_size - 1) // page_size
|
||||||
|
)
|
||||||
|
|
||||||
|
@router.get("/{rule_id}", response_model=RuleSchema)
|
||||||
|
async def get_rule(
|
||||||
|
rule_id: int,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Retrieve a specific rule by ID.
|
||||||
|
"""
|
||||||
|
rule = db.query(Rule).filter(Rule.id == rule_id).first()
|
||||||
|
if not rule:
|
||||||
|
raise HTTPException(status_code=404, detail="Rule not found")
|
||||||
|
|
||||||
|
result = RuleSchema.from_orm(rule)
|
||||||
|
result.conditions = json.loads(rule.conditions)
|
||||||
|
result.actions = json.loads(rule.actions)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
@router.put("/{rule_id}", response_model=RuleSchema)
|
||||||
|
async def update_rule(
|
||||||
|
rule_id: int,
|
||||||
|
rule_update: RuleUpdate,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Update an existing rule and create a new version.
|
||||||
|
|
||||||
|
This endpoint creates a new version of the rule for audit purposes
|
||||||
|
while updating the main rule record.
|
||||||
|
"""
|
||||||
|
rule = db.query(Rule).filter(Rule.id == rule_id).first()
|
||||||
|
if not rule:
|
||||||
|
raise HTTPException(status_code=404, detail="Rule not found")
|
||||||
|
|
||||||
|
# Save current version to rule_versions before updating
|
||||||
|
current_version = RuleVersion(
|
||||||
|
rule_id=rule.id,
|
||||||
|
version=rule.version,
|
||||||
|
name=rule.name,
|
||||||
|
description=rule.description,
|
||||||
|
rule_type=rule.rule_type,
|
||||||
|
conditions=rule.conditions,
|
||||||
|
actions=rule.actions,
|
||||||
|
priority=rule.priority,
|
||||||
|
created_by=rule.created_by
|
||||||
|
)
|
||||||
|
db.add(current_version)
|
||||||
|
|
||||||
|
# Update rule with new values
|
||||||
|
if rule_update.name is not None:
|
||||||
|
# Check if new name conflicts with existing rules
|
||||||
|
existing = db.query(Rule).filter(Rule.name == rule_update.name, Rule.id != rule_id).first()
|
||||||
|
if existing:
|
||||||
|
raise HTTPException(status_code=400, detail="Rule with this name already exists")
|
||||||
|
rule.name = rule_update.name
|
||||||
|
|
||||||
|
if rule_update.description is not None:
|
||||||
|
rule.description = rule_update.description
|
||||||
|
|
||||||
|
if rule_update.conditions is not None:
|
||||||
|
rule.conditions = json.dumps([condition.dict() for condition in rule_update.conditions])
|
||||||
|
|
||||||
|
if rule_update.actions is not None:
|
||||||
|
rule.actions = json.dumps([action.dict() for action in rule_update.actions])
|
||||||
|
|
||||||
|
if rule_update.priority is not None:
|
||||||
|
rule.priority = rule_update.priority
|
||||||
|
|
||||||
|
if rule_update.is_active is not None:
|
||||||
|
rule.is_active = rule_update.is_active
|
||||||
|
|
||||||
|
# Increment version
|
||||||
|
rule.version += 1
|
||||||
|
|
||||||
|
db.commit()
|
||||||
|
db.refresh(rule)
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
result = RuleSchema.from_orm(rule)
|
||||||
|
result.conditions = json.loads(rule.conditions)
|
||||||
|
result.actions = json.loads(rule.actions)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
@router.delete("/{rule_id}")
|
||||||
|
async def delete_rule(
|
||||||
|
rule_id: int,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Soft delete a rule by marking it as inactive.
|
||||||
|
|
||||||
|
Rules are not permanently deleted to maintain audit trail.
|
||||||
|
"""
|
||||||
|
rule = db.query(Rule).filter(Rule.id == rule_id).first()
|
||||||
|
if not rule:
|
||||||
|
raise HTTPException(status_code=404, detail="Rule not found")
|
||||||
|
|
||||||
|
rule.is_active = False
|
||||||
|
db.commit()
|
||||||
|
|
||||||
|
return {"message": "Rule deactivated successfully"}
|
||||||
|
|
||||||
|
@router.get("/{rule_id}/versions")
|
||||||
|
async def get_rule_versions(
|
||||||
|
rule_id: int,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Retrieve version history for a specific rule.
|
||||||
|
"""
|
||||||
|
rule = db.query(Rule).filter(Rule.id == rule_id).first()
|
||||||
|
if not rule:
|
||||||
|
raise HTTPException(status_code=404, detail="Rule not found")
|
||||||
|
|
||||||
|
versions = db.query(RuleVersion).filter(RuleVersion.rule_id == rule_id).order_by(desc(RuleVersion.version)).all()
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
result_versions = []
|
||||||
|
for version in versions:
|
||||||
|
version_dict = {
|
||||||
|
"id": version.id,
|
||||||
|
"rule_id": version.rule_id,
|
||||||
|
"version": version.version,
|
||||||
|
"name": version.name,
|
||||||
|
"description": version.description,
|
||||||
|
"rule_type": version.rule_type,
|
||||||
|
"conditions": json.loads(version.conditions),
|
||||||
|
"actions": json.loads(version.actions),
|
||||||
|
"priority": version.priority,
|
||||||
|
"created_by": version.created_by,
|
||||||
|
"created_at": version.created_at
|
||||||
|
}
|
||||||
|
result_versions.append(version_dict)
|
||||||
|
|
||||||
|
return result_versions
|
||||||
|
|
||||||
|
@router.post("/{rule_id}/rollback/{version}")
|
||||||
|
async def rollback_rule(
|
||||||
|
rule_id: int,
|
||||||
|
version: int,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Rollback a rule to a previous version.
|
||||||
|
"""
|
||||||
|
rule = db.query(Rule).filter(Rule.id == rule_id).first()
|
||||||
|
if not rule:
|
||||||
|
raise HTTPException(status_code=404, detail="Rule not found")
|
||||||
|
|
||||||
|
target_version = db.query(RuleVersion).filter(
|
||||||
|
RuleVersion.rule_id == rule_id,
|
||||||
|
RuleVersion.version == version
|
||||||
|
).first()
|
||||||
|
|
||||||
|
if not target_version:
|
||||||
|
raise HTTPException(status_code=404, detail="Rule version not found")
|
||||||
|
|
||||||
|
# Save current version before rollback
|
||||||
|
current_version = RuleVersion(
|
||||||
|
rule_id=rule.id,
|
||||||
|
version=rule.version,
|
||||||
|
name=rule.name,
|
||||||
|
description=rule.description,
|
||||||
|
rule_type=rule.rule_type,
|
||||||
|
conditions=rule.conditions,
|
||||||
|
actions=rule.actions,
|
||||||
|
priority=rule.priority,
|
||||||
|
created_by=rule.created_by
|
||||||
|
)
|
||||||
|
db.add(current_version)
|
||||||
|
|
||||||
|
# Rollback to target version
|
||||||
|
rule.name = target_version.name
|
||||||
|
rule.description = target_version.description
|
||||||
|
rule.rule_type = target_version.rule_type
|
||||||
|
rule.conditions = target_version.conditions
|
||||||
|
rule.actions = target_version.actions
|
||||||
|
rule.priority = target_version.priority
|
||||||
|
rule.version += 1 # Increment version even for rollback
|
||||||
|
|
||||||
|
db.commit()
|
||||||
|
db.refresh(rule)
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
result = RuleSchema.from_orm(rule)
|
||||||
|
result.conditions = json.loads(rule.conditions)
|
||||||
|
result.actions = json.loads(rule.actions)
|
||||||
|
|
||||||
|
return result
|
app/api/screening.py (new file, 528 lines)
|
|||||||
|
import json
|
||||||
|
import time
|
||||||
|
import uuid
|
||||||
|
from typing import List, Optional
|
||||||
|
from datetime import datetime
|
||||||
|
from fastapi import APIRouter, Depends, HTTPException, Query, BackgroundTasks
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from sqlalchemy import desc, func
|
||||||
|
|
||||||
|
from app.db.session import get_db
|
||||||
|
from app.models.screening import ScreeningResult, ScreeningBatch
|
||||||
|
from app.models.transaction import Transaction
|
||||||
|
from app.models.rule import Rule
|
||||||
|
from app.services.rule_engine import RuleEngine
|
||||||
|
from app.core.schemas import (
|
||||||
|
ScreeningResponse, BatchScreeningRequest, ScreeningBatch as ScreeningBatchSchema,
|
||||||
|
PaginatedResponse, AggregateRequest, AggregateResponse
|
||||||
|
)
|
||||||
|
|
||||||
|
router = APIRouter()
|
||||||
|
|
||||||
|
@router.post("/transactions/{transaction_id}", response_model=ScreeningResponse)
|
||||||
|
async def screen_transaction(
|
||||||
|
transaction_id: str,
|
||||||
|
rule_ids: Optional[List[int]] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Screen a single transaction against fraud detection rules in real-time.
|
||||||
|
|
||||||
|
This endpoint evaluates a transaction against all active rules or specific rules
|
||||||
|
and returns the screening results immediately. Suitable for real-time fraud detection.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
transaction_id: The ID of the transaction to screen
|
||||||
|
rule_ids: Optional list of specific rule IDs to apply (if None, applies all active rules)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
ScreeningResponse with detailed results and overall risk assessment
|
||||||
|
"""
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
# Initialize rule engine
|
||||||
|
rule_engine = RuleEngine(db)
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Evaluate transaction
|
||||||
|
screening_results = rule_engine.evaluate_transaction(transaction_id, rule_ids)
|
||||||
|
|
||||||
|
# Save results to database
|
||||||
|
db_results = []
|
||||||
|
total_risk_score = 0.0
|
||||||
|
flagged_count = 0
|
||||||
|
|
||||||
|
for result in screening_results:
|
||||||
|
db_result = ScreeningResult(
|
||||||
|
transaction_id=result.transaction_id,
|
||||||
|
rule_id=result.rule_id,
|
||||||
|
rule_name=result.rule_name,
|
||||||
|
rule_version=result.rule_version,
|
||||||
|
status=result.status,
|
||||||
|
risk_score=result.risk_score,
|
||||||
|
details=json.dumps(result.details) if result.details else None,
|
||||||
|
aggregated_data=json.dumps(result.aggregated_data) if result.aggregated_data else None,
|
||||||
|
screening_type=result.screening_type
|
||||||
|
)
|
||||||
|
db.add(db_result)
|
||||||
|
db_results.append(db_result)
|
||||||
|
|
||||||
|
total_risk_score += result.risk_score
|
||||||
|
if result.status == "flagged":
|
||||||
|
flagged_count += 1
|
||||||
|
|
||||||
|
db.commit()
|
||||||
|
|
||||||
|
# Refresh to get IDs
|
||||||
|
for db_result in db_results:
|
||||||
|
db.refresh(db_result)
|
||||||
|
|
||||||
|
# Determine overall status
|
||||||
|
overall_status = "flagged" if flagged_count > 0 else "clean"
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
response_results = []
|
||||||
|
for i, db_result in enumerate(db_results):
|
||||||
|
result_dict = {
|
||||||
|
"id": db_result.id,
|
||||||
|
"transaction_id": db_result.transaction_id,
|
||||||
|
"rule_id": db_result.rule_id,
|
||||||
|
"rule_name": db_result.rule_name,
|
||||||
|
"rule_version": db_result.rule_version,
|
||||||
|
"status": db_result.status,
|
||||||
|
"risk_score": db_result.risk_score,
|
||||||
|
"details": json.loads(db_result.details) if db_result.details else None,
|
||||||
|
"aggregated_data": json.loads(db_result.aggregated_data) if db_result.aggregated_data else None,
|
||||||
|
"screening_type": db_result.screening_type,
|
||||||
|
"created_at": db_result.created_at
|
||||||
|
}
|
||||||
|
response_results.append(result_dict)
|
||||||
|
|
||||||
|
screening_duration = (time.time() - start_time) * 1000
|
||||||
|
|
||||||
|
return ScreeningResponse(
|
||||||
|
transaction_id=transaction_id,
|
||||||
|
results=response_results,
|
||||||
|
overall_status=overall_status,
|
||||||
|
total_risk_score=total_risk_score,
|
||||||
|
screening_duration_ms=screening_duration
|
||||||
|
)
|
||||||
|
|
||||||
|
except ValueError as e:
|
||||||
|
raise HTTPException(status_code=404, detail=str(e))
|
||||||
|
except Exception as e:
|
||||||
|
raise HTTPException(status_code=500, detail=f"Screening failed: {str(e)}")
|
||||||
|
|
||||||
|
@router.post("/batch", response_model=ScreeningBatchSchema)
|
||||||
|
async def create_batch_screening(
|
||||||
|
request: BatchScreeningRequest,
|
||||||
|
background_tasks: BackgroundTasks,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Create a batch screening job for multiple transactions.
|
||||||
|
|
||||||
|
This endpoint creates a batch job that will screen multiple transactions
|
||||||
|
based on the provided filters. The job runs in the background and
|
||||||
|
results can be retrieved using the batch endpoints.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
request: Batch screening configuration including filters and rules
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
ScreeningBatchSchema with batch job details
|
||||||
|
"""
|
||||||
|
# Generate unique batch ID
|
||||||
|
batch_id = str(uuid.uuid4())
|
||||||
|
|
||||||
|
# Count transactions that match the filters
|
||||||
|
query = db.query(Transaction)
|
||||||
|
|
||||||
|
if request.transaction_filters:
|
||||||
|
# Apply transaction filters
|
||||||
|
filters = request.transaction_filters
|
||||||
|
if "user_id" in filters:
|
||||||
|
query = query.filter(Transaction.user_id == filters["user_id"])
|
||||||
|
if "account_id" in filters:
|
||||||
|
query = query.filter(Transaction.account_id == filters["account_id"])
|
||||||
|
if "transaction_type" in filters:
|
||||||
|
query = query.filter(Transaction.transaction_type == filters["transaction_type"])
|
||||||
|
if "min_amount" in filters:
|
||||||
|
query = query.filter(Transaction.amount >= filters["min_amount"])
|
||||||
|
if "max_amount" in filters:
|
||||||
|
query = query.filter(Transaction.amount <= filters["max_amount"])
|
||||||
|
if "channel" in filters:
|
||||||
|
query = query.filter(Transaction.channel == filters["channel"])
|
||||||
|
if "status" in filters:
|
||||||
|
query = query.filter(Transaction.status == filters["status"])
|
||||||
|
|
||||||
|
if request.date_from:
|
||||||
|
query = query.filter(Transaction.created_at >= request.date_from)
|
||||||
|
if request.date_to:
|
||||||
|
query = query.filter(Transaction.created_at <= request.date_to)
|
||||||
|
|
||||||
|
total_transactions = query.count()
|
||||||
|
|
||||||
|
# Get rule IDs to apply
|
||||||
|
rule_ids_to_apply = request.rule_ids
|
||||||
|
if not rule_ids_to_apply:
|
||||||
|
# Use all active rules
|
||||||
|
active_rules = db.query(Rule.id).filter(Rule.is_active).all()
|
||||||
|
rule_ids_to_apply = [rule.id for rule in active_rules]
|
||||||
|
|
||||||
|
# Create batch record
|
||||||
|
batch = ScreeningBatch(
|
||||||
|
batch_id=batch_id,
|
||||||
|
name=request.name,
|
||||||
|
description=request.description,
|
||||||
|
status="pending",
|
||||||
|
total_transactions=total_transactions,
|
||||||
|
processed_transactions=0,
|
||||||
|
flagged_transactions=0,
|
||||||
|
rules_applied=json.dumps(rule_ids_to_apply)
|
||||||
|
)
|
||||||
|
|
||||||
|
db.add(batch)
|
||||||
|
db.commit()
|
||||||
|
db.refresh(batch)
|
||||||
|
|
||||||
|
# Add background task to process the batch
|
||||||
|
background_tasks.add_task(
|
||||||
|
process_batch_screening,
|
||||||
|
batch.id,
|
||||||
|
request.dict(),
|
||||||
|
rule_ids_to_apply
|
||||||
|
)
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
result = ScreeningBatchSchema.from_orm(batch)
|
||||||
|
result.rules_applied = rule_ids_to_apply
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
@router.get("/batch", response_model=PaginatedResponse)
|
||||||
|
async def get_screening_batches(
|
||||||
|
page: int = Query(1, ge=1),
|
||||||
|
page_size: int = Query(100, ge=1, le=1000),
|
||||||
|
status: Optional[str] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Retrieve screening batch jobs with filtering and pagination.
|
||||||
|
"""
|
||||||
|
query = db.query(ScreeningBatch)
|
||||||
|
|
||||||
|
if status:
|
||||||
|
query = query.filter(ScreeningBatch.status == status)
|
||||||
|
|
||||||
|
# Get total count
|
||||||
|
total = query.count()
|
||||||
|
|
||||||
|
# Apply pagination
|
||||||
|
offset = (page - 1) * page_size
|
||||||
|
batches = query.order_by(desc(ScreeningBatch.created_at)).offset(offset).limit(page_size).all()
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
items = []
|
||||||
|
for batch in batches:
|
||||||
|
result = ScreeningBatchSchema.from_orm(batch)
|
||||||
|
if batch.rules_applied:
|
||||||
|
result.rules_applied = json.loads(batch.rules_applied)
|
||||||
|
items.append(result)
|
||||||
|
|
||||||
|
return PaginatedResponse(
|
||||||
|
items=items,
|
||||||
|
total=total,
|
||||||
|
page=page,
|
||||||
|
page_size=page_size,
|
||||||
|
total_pages=(total + page_size - 1) // page_size
|
||||||
|
)
|
||||||
|
|
||||||
|
@router.get("/batch/{batch_id}", response_model=ScreeningBatchSchema)
|
||||||
|
async def get_screening_batch(
|
||||||
|
batch_id: str,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Retrieve a specific screening batch by ID.
|
||||||
|
"""
|
||||||
|
batch = db.query(ScreeningBatch).filter(ScreeningBatch.batch_id == batch_id).first()
|
||||||
|
if not batch:
|
||||||
|
raise HTTPException(status_code=404, detail="Batch not found")
|
||||||
|
|
||||||
|
result = ScreeningBatchSchema.from_orm(batch)
|
||||||
|
if batch.rules_applied:
|
||||||
|
result.rules_applied = json.loads(batch.rules_applied)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
@router.get("/results", response_model=PaginatedResponse)
|
||||||
|
async def get_screening_results(
|
||||||
|
page: int = Query(1, ge=1),
|
||||||
|
page_size: int = Query(100, ge=1, le=1000),
|
||||||
|
transaction_id: Optional[str] = None,
|
||||||
|
rule_id: Optional[int] = None,
|
||||||
|
status: Optional[str] = None,
|
||||||
|
screening_type: Optional[str] = None,
|
||||||
|
min_risk_score: Optional[float] = None,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Retrieve screening results with filtering and pagination.
|
||||||
|
|
||||||
|
This endpoint provides audit and reporting capabilities for screening results.
|
||||||
|
Results can be filtered by various criteria for compliance and analysis.
|
||||||
|
"""
|
||||||
|
query = db.query(ScreeningResult)
|
||||||
|
|
||||||
|
# Apply filters
|
||||||
|
if transaction_id:
|
||||||
|
query = query.filter(ScreeningResult.transaction_id == transaction_id)
|
||||||
|
if rule_id:
|
||||||
|
query = query.filter(ScreeningResult.rule_id == rule_id)
|
||||||
|
if status:
|
||||||
|
query = query.filter(ScreeningResult.status == status)
|
||||||
|
if screening_type:
|
||||||
|
query = query.filter(ScreeningResult.screening_type == screening_type)
|
||||||
|
if min_risk_score is not None:
|
||||||
|
query = query.filter(ScreeningResult.risk_score >= min_risk_score)
|
||||||
|
|
||||||
|
# Get total count
|
||||||
|
total = query.count()
|
||||||
|
|
||||||
|
# Apply pagination
|
||||||
|
offset = (page - 1) * page_size
|
||||||
|
results = query.order_by(desc(ScreeningResult.created_at)).offset(offset).limit(page_size).all()
|
||||||
|
|
||||||
|
# Convert to response format
|
||||||
|
items = []
|
||||||
|
for result in results:
|
||||||
|
result_dict = {
|
||||||
|
"id": result.id,
|
||||||
|
"transaction_id": result.transaction_id,
|
||||||
|
"rule_id": result.rule_id,
|
||||||
|
"rule_name": result.rule_name,
|
||||||
|
"rule_version": result.rule_version,
|
||||||
|
"status": result.status,
|
||||||
|
"risk_score": result.risk_score,
|
||||||
|
"details": json.loads(result.details) if result.details else None,
|
||||||
|
"aggregated_data": json.loads(result.aggregated_data) if result.aggregated_data else None,
|
||||||
|
"screening_type": result.screening_type,
|
||||||
|
"created_at": result.created_at
|
||||||
|
}
|
||||||
|
items.append(result_dict)
|
||||||
|
|
||||||
|
return PaginatedResponse(
|
||||||
|
items=items,
|
||||||
|
total=total,
|
||||||
|
page=page,
|
||||||
|
page_size=page_size,
|
||||||
|
total_pages=(total + page_size - 1) // page_size
|
||||||
|
)
|
||||||
|
|
||||||
|
@router.post("/aggregate", response_model=AggregateResponse)
|
||||||
|
async def compute_aggregate(
|
||||||
|
request: AggregateRequest,
|
||||||
|
db: Session = Depends(get_db)
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Compute aggregate values for transactions with optional caching.
|
||||||
|
|
||||||
|
This endpoint allows computing various aggregations on transaction data
|
||||||
|
such as sum, count, average, etc. Results are cached for performance.
|
||||||
|
|
||||||
|
Example request:
|
||||||
|
{
|
||||||
|
"aggregate_function": "sum",
|
||||||
|
"field": "amount",
|
||||||
|
"group_by": ["user_id"],
|
||||||
|
"filters": {"transaction_type": "debit"},
|
||||||
|
"time_window": "24h"
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Build query
|
||||||
|
query = db.query(Transaction)
|
||||||
|
|
||||||
|
# Apply filters
|
||||||
|
if request.filters:
|
||||||
|
for field, value in request.filters.items():
|
||||||
|
if hasattr(Transaction, field):
|
||||||
|
query = query.filter(getattr(Transaction, field) == value)
|
||||||
|
|
||||||
|
# Apply date range
|
||||||
|
if request.date_from:
|
||||||
|
query = query.filter(Transaction.created_at >= request.date_from)
|
||||||
|
if request.date_to:
|
||||||
|
query = query.filter(Transaction.created_at <= request.date_to)
|
||||||
|
|
||||||
|
# Apply time window
|
||||||
|
if request.time_window:
|
||||||
|
rule_engine = RuleEngine(db)
|
||||||
|
time_delta = rule_engine._parse_time_window(request.time_window)
|
||||||
|
            cutoff_time = datetime.utcnow() - time_delta
            query = query.filter(Transaction.created_at >= cutoff_time)

        # Compute aggregate
        if request.aggregate_function == "count":
            if request.field == "*":
                result = query.count()
            else:
                field_attr = getattr(Transaction, request.field, None)
                if field_attr:
                    result = query.filter(field_attr.isnot(None)).count()
                else:
                    result = 0

        elif request.aggregate_function == "sum":
            field_attr = getattr(Transaction, request.field, None)
            if field_attr:
                result = query.with_entities(func.sum(field_attr)).scalar() or 0
            else:
                result = 0

        elif request.aggregate_function == "avg":
            field_attr = getattr(Transaction, request.field, None)
            if field_attr:
                result = query.with_entities(func.avg(field_attr)).scalar() or 0
            else:
                result = 0

        elif request.aggregate_function == "max":
            field_attr = getattr(Transaction, request.field, None)
            if field_attr:
                result = query.with_entities(func.max(field_attr)).scalar() or 0
            else:
                result = 0

        elif request.aggregate_function == "min":
            field_attr = getattr(Transaction, request.field, None)
            if field_attr:
                result = query.with_entities(func.min(field_attr)).scalar() or 0
            else:
                result = 0

        else:
            raise HTTPException(status_code=400, detail=f"Unsupported aggregate function: {request.aggregate_function}")

        # Handle group by
        if request.group_by:
            # This is a simplified implementation
            # In a production system, you might want to use SQL GROUP BY
            result = {"total": result, "note": "Group by aggregation simplified for demo"}

        computation_time = (time.time() - start_time) * 1000

        return AggregateResponse(
            result={"value": result} if not request.group_by else result,
            cache_hit=False,
            computation_time_ms=computation_time
        )

    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Aggregate computation failed: {str(e)}")


def process_batch_screening(batch_id: int, request_data: dict, rule_ids: List[int]):
    """
    Background task to process batch screening.
    """
    from app.db.session import SessionLocal

    db = SessionLocal()
    try:
        # Get batch record
        batch = db.query(ScreeningBatch).filter(ScreeningBatch.id == batch_id).first()
        if not batch:
            return

        # Update status to processing
        batch.status = "processing"
        batch.started_at = datetime.utcnow()
        db.commit()

        # Build transaction query
        query = db.query(Transaction)

        # Apply filters from request
        request_obj = BatchScreeningRequest(**request_data)
        if request_obj.transaction_filters:
            filters = request_obj.transaction_filters
            if "user_id" in filters:
                query = query.filter(Transaction.user_id == filters["user_id"])
            if "account_id" in filters:
                query = query.filter(Transaction.account_id == filters["account_id"])
            # Add more filters as needed

        if request_obj.date_from:
            query = query.filter(Transaction.created_at >= request_obj.date_from)
        if request_obj.date_to:
            query = query.filter(Transaction.created_at <= request_obj.date_to)

        # Get transactions to process
        transactions = query.all()

        # Initialize rule engine
        rule_engine = RuleEngine(db)

        processed_count = 0
        flagged_count = 0

        # Process each transaction
        for transaction in transactions:
            try:
                # Screen transaction
                screening_results = rule_engine.evaluate_transaction(transaction.transaction_id, rule_ids)

                # Save results
                transaction_flagged = False
                for result in screening_results:
                    db_result = ScreeningResult(
                        transaction_id=result.transaction_id,
                        rule_id=result.rule_id,
                        rule_name=result.rule_name,
                        rule_version=result.rule_version,
                        status=result.status,
                        risk_score=result.risk_score,
                        details=json.dumps(result.details) if result.details else None,
                        aggregated_data=json.dumps(result.aggregated_data) if result.aggregated_data else None,
                        screening_type="batch"
                    )
                    db.add(db_result)

                    if result.status == "flagged":
                        transaction_flagged = True

                if transaction_flagged:
                    flagged_count += 1

                processed_count += 1

                # Update batch progress periodically
                if processed_count % 100 == 0:
                    batch.processed_transactions = processed_count
                    batch.flagged_transactions = flagged_count
                    db.commit()

            except Exception as e:
                # Log error but continue processing
                print(f"Error processing transaction {transaction.transaction_id}: {str(e)}")
                continue

        # Update final batch status
        batch.status = "completed"
        batch.processed_transactions = processed_count
        batch.flagged_transactions = flagged_count
        batch.completed_at = datetime.utcnow()
        db.commit()

    except Exception as e:
        # Mark batch as failed
        batch.status = "failed"
        batch.completed_at = datetime.utcnow()
        db.commit()
        print(f"Batch screening failed: {str(e)}")

    finally:
        db.close()
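For context, a minimal sketch of how this background task could be scheduled from an endpoint. The actual batch-screening route lives elsewhere in this commit and is not shown here; the path, the uuid-based batch ID, and the payload handling below are assumptions for illustration only.

```python
# Hypothetical wiring (not part of this file): hand process_batch_screening to
# FastAPI's BackgroundTasks after creating the batch row, so the request returns
# immediately while screening runs in the background.
import uuid

from fastapi import APIRouter, BackgroundTasks, Depends
from sqlalchemy.orm import Session

from app.core.schemas import BatchScreeningRequest
from app.db.session import get_db
from app.models.screening import ScreeningBatch

router = APIRouter()

@router.post("/screening/batch")  # assumed path; the real route is defined elsewhere in this commit
async def start_batch_screening(
    request: BatchScreeningRequest,
    background_tasks: BackgroundTasks,
    db: Session = Depends(get_db),
):
    # Create the batch record up front so progress can be polled by batch_id.
    batch = ScreeningBatch(batch_id=str(uuid.uuid4()), name=request.name, status="pending")
    db.add(batch)
    db.commit()
    db.refresh(batch)

    # An empty rule_ids list means "apply all active rules" in the rule engine.
    background_tasks.add_task(
        process_batch_screening,
        batch.id,
        request.dict(),
        request.rule_ids or [],
    )
    return {"batch_id": batch.batch_id, "status": batch.status}
```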
224  app/api/transactions.py  Normal file
@@ -0,0 +1,224 @@
from typing import List, Optional
from fastapi import APIRouter, Depends, HTTPException, Query
from sqlalchemy.orm import Session
from sqlalchemy import desc
import json

from app.db.session import get_db
from app.models.transaction import Transaction
from app.core.schemas import TransactionCreate, TransactionUpdate, Transaction as TransactionSchema, PaginatedResponse

router = APIRouter()


@router.post("/", response_model=TransactionSchema)
async def create_transaction(
    transaction: TransactionCreate,
    db: Session = Depends(get_db)
):
    """
    Ingest a new transaction into the system.

    This endpoint accepts transaction data and stores it for fraud screening.
    The transaction will be available for real-time screening immediately.
    """
    # Check if transaction already exists
    existing = db.query(Transaction).filter(Transaction.transaction_id == transaction.transaction_id).first()
    if existing:
        raise HTTPException(status_code=400, detail="Transaction with this ID already exists")

    # Convert metadata to JSON string if provided
    metadata_json = json.dumps(transaction.metadata) if transaction.metadata else None

    db_transaction = Transaction(
        transaction_id=transaction.transaction_id,
        user_id=transaction.user_id,
        account_id=transaction.account_id,
        amount=transaction.amount,
        currency=transaction.currency,
        transaction_type=transaction.transaction_type,
        merchant_id=transaction.merchant_id,
        merchant_category=transaction.merchant_category,
        channel=transaction.channel,
        location=transaction.location,
        ip_address=transaction.ip_address,
        device_id=transaction.device_id,
        status=transaction.status,
        metadata=metadata_json
    )

    db.add(db_transaction)
    db.commit()
    db.refresh(db_transaction)

    # Convert metadata back to dict for response
    result = TransactionSchema.from_orm(db_transaction)
    if db_transaction.metadata:
        result.metadata = json.loads(db_transaction.metadata)

    return result


@router.post("/bulk", response_model=List[TransactionSchema])
async def create_transactions_bulk(
    transactions: List[TransactionCreate],
    db: Session = Depends(get_db)
):
    """
    Ingest multiple transactions in bulk.

    This endpoint is optimized for high-volume transaction ingestion.
    Duplicate transaction IDs will be skipped.
    """
    created_transactions = []

    for transaction in transactions:
        # Check if transaction already exists
        existing = db.query(Transaction).filter(Transaction.transaction_id == transaction.transaction_id).first()
        if existing:
            continue

        metadata_json = json.dumps(transaction.metadata) if transaction.metadata else None

        db_transaction = Transaction(
            transaction_id=transaction.transaction_id,
            user_id=transaction.user_id,
            account_id=transaction.account_id,
            amount=transaction.amount,
            currency=transaction.currency,
            transaction_type=transaction.transaction_type,
            merchant_id=transaction.merchant_id,
            merchant_category=transaction.merchant_category,
            channel=transaction.channel,
            location=transaction.location,
            ip_address=transaction.ip_address,
            device_id=transaction.device_id,
            status=transaction.status,
            metadata=metadata_json
        )

        db.add(db_transaction)
        created_transactions.append(db_transaction)

    db.commit()

    # Refresh and convert to response format
    results = []
    for db_transaction in created_transactions:
        db.refresh(db_transaction)
        result = TransactionSchema.from_orm(db_transaction)
        if db_transaction.metadata:
            result.metadata = json.loads(db_transaction.metadata)
        results.append(result)

    return results


@router.get("/", response_model=PaginatedResponse)
async def get_transactions(
    page: int = Query(1, ge=1),
    page_size: int = Query(100, ge=1, le=1000),
    user_id: Optional[str] = None,
    account_id: Optional[str] = None,
    transaction_type: Optional[str] = None,
    channel: Optional[str] = None,
    status: Optional[str] = None,
    min_amount: Optional[float] = None,
    max_amount: Optional[float] = None,
    db: Session = Depends(get_db)
):
    """
    Retrieve transactions with filtering and pagination.

    Supports filtering by various transaction attributes and returns
    paginated results for efficient data retrieval.
    """
    query = db.query(Transaction)

    # Apply filters
    if user_id:
        query = query.filter(Transaction.user_id == user_id)
    if account_id:
        query = query.filter(Transaction.account_id == account_id)
    if transaction_type:
        query = query.filter(Transaction.transaction_type == transaction_type)
    if channel:
        query = query.filter(Transaction.channel == channel)
    if status:
        query = query.filter(Transaction.status == status)
    if min_amount is not None:
        query = query.filter(Transaction.amount >= min_amount)
    if max_amount is not None:
        query = query.filter(Transaction.amount <= max_amount)

    # Get total count
    total = query.count()

    # Apply pagination
    offset = (page - 1) * page_size
    transactions = query.order_by(desc(Transaction.created_at)).offset(offset).limit(page_size).all()

    # Convert to response format
    items = []
    for transaction in transactions:
        result = TransactionSchema.from_orm(transaction)
        if transaction.metadata:
            result.metadata = json.loads(transaction.metadata)
        items.append(result)

    return PaginatedResponse(
        items=items,
        total=total,
        page=page,
        page_size=page_size,
        total_pages=(total + page_size - 1) // page_size
    )


@router.get("/{transaction_id}", response_model=TransactionSchema)
async def get_transaction(
    transaction_id: str,
    db: Session = Depends(get_db)
):
    """
    Retrieve a specific transaction by ID.
    """
    transaction = db.query(Transaction).filter(Transaction.transaction_id == transaction_id).first()
    if not transaction:
        raise HTTPException(status_code=404, detail="Transaction not found")

    result = TransactionSchema.from_orm(transaction)
    if transaction.metadata:
        result.metadata = json.loads(transaction.metadata)

    return result


@router.patch("/{transaction_id}", response_model=TransactionSchema)
async def update_transaction(
    transaction_id: str,
    transaction_update: TransactionUpdate,
    db: Session = Depends(get_db)
):
    """
    Update a transaction's status or metadata.

    This endpoint allows updating transaction status (e.g., from pending to completed)
    and adding additional metadata without modifying core transaction data.
    """
    transaction = db.query(Transaction).filter(Transaction.transaction_id == transaction_id).first()
    if not transaction:
        raise HTTPException(status_code=404, detail="Transaction not found")

    if transaction_update.status is not None:
        transaction.status = transaction_update.status

    if transaction_update.metadata is not None:
        # Merge with existing metadata
        existing_metadata = json.loads(transaction.metadata) if transaction.metadata else {}
        existing_metadata.update(transaction_update.metadata)
        transaction.metadata = json.dumps(existing_metadata)

    db.commit()
    db.refresh(transaction)

    result = TransactionSchema.from_orm(transaction)
    if transaction.metadata:
        result.metadata = json.loads(transaction.metadata)

    return result
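A minimal client-side sketch of the ingestion endpoint above. It assumes the server is running locally and that this router is mounted at /api/v1/transactions: the /api/v1 prefix is set in main.py, but the "transactions" segment is an assumption, since the route registration lives in app/api/routes.py, which is not shown in this part of the diff. The `requests` library is an extra client dependency, not part of requirements.txt.

```python
# Hypothetical ingestion call against a locally running instance.
import requests

payload = {
    "transaction_id": "txn-0001",
    "user_id": "user-123",
    "account_id": "acct-456",
    "amount": 250000.0,
    "currency": "NGN",
    "transaction_type": "debit",
    "channel": "mobile",
    "device_id": "device-789",
    "metadata": {"note": "example ingestion"},
}

resp = requests.post(
    "http://localhost:8000/api/v1/transactions/",  # assumed mount point
    json=payload,
    timeout=10,
)
print(resp.status_code, resp.json())
```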
0  app/core/__init__.py  Normal file

14  app/core/config.py  Normal file
@@ -0,0 +1,14 @@
import os
from typing import Optional


class Settings:
    PROJECT_NAME: str = "Transaction Fraud Monitoring API"
    VERSION: str = "1.0.0"
    API_V1_STR: str = "/api/v1"

    DATABASE_URL: Optional[str] = os.getenv("DATABASE_URL")
    SECRET_KEY: str = os.getenv("SECRET_KEY", "your-secret-key-here")
    ALGORITHM: str = "HS256"
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 30


settings = Settings()
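Note that these values are read from the environment once, when the class body is evaluated at import time, and that app/db/session.py (below) hardcodes the SQLite path rather than consuming DATABASE_URL. A small sketch of overriding them; the specific values are illustrative only:

```python
# Environment variables must be set before app.core.config is first imported,
# because Settings reads os.getenv at class-definition time.
import os

os.environ["SECRET_KEY"] = "change-me"          # illustrative value
os.environ["DATABASE_URL"] = "sqlite:///./dev.db"  # illustrative; not used by app/db/session.py

from app.core.config import settings

print(settings.PROJECT_NAME)  # Transaction Fraud Monitoring API
print(settings.SECRET_KEY)    # change-me
```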
151  app/core/schemas.py  Normal file
@@ -0,0 +1,151 @@
from typing import Optional, Dict, Any, List
from datetime import datetime
from pydantic import BaseModel


# Transaction Schemas
class TransactionBase(BaseModel):
    transaction_id: str
    user_id: str
    account_id: str
    amount: float
    currency: str = "NGN"
    transaction_type: str  # debit, credit, transfer
    merchant_id: Optional[str] = None
    merchant_category: Optional[str] = None
    channel: str  # web, mobile, atm, pos
    location: Optional[str] = None
    ip_address: Optional[str] = None
    device_id: Optional[str] = None
    status: str = "pending"
    metadata: Optional[Dict[str, Any]] = None


class TransactionCreate(TransactionBase):
    pass


class TransactionUpdate(BaseModel):
    status: Optional[str] = None
    metadata: Optional[Dict[str, Any]] = None


class Transaction(TransactionBase):
    id: int
    created_at: datetime
    updated_at: Optional[datetime] = None

    class Config:
        from_attributes = True


# Rule Schemas
class RuleCondition(BaseModel):
    field: str
    operator: str  # eq, ne, gt, gte, lt, lte, in, not_in, contains, starts_with, ends_with
    value: Any
    aggregate_function: Optional[str] = None  # sum, count, avg, max, min
    time_window: Optional[str] = None  # "24h", "7d", "30d", etc.
    group_by: Optional[List[str]] = None  # ["user_id", "account_id"]


class RuleAction(BaseModel):
    action_type: str  # flag, block, alert, score
    parameters: Dict[str, Any] = {}


class RuleBase(BaseModel):
    name: str
    description: Optional[str] = None
    rule_type: str  # velocity, amount_limit, blacklist, pattern, etc.
    conditions: List[RuleCondition]
    actions: List[RuleAction]
    priority: int = 1
    is_active: bool = True


class RuleCreate(RuleBase):
    created_by: Optional[str] = None


class RuleUpdate(BaseModel):
    name: Optional[str] = None
    description: Optional[str] = None
    conditions: Optional[List[RuleCondition]] = None
    actions: Optional[List[RuleAction]] = None
    priority: Optional[int] = None
    is_active: Optional[bool] = None


class Rule(RuleBase):
    id: int
    version: int
    created_by: Optional[str] = None
    created_at: datetime
    updated_at: Optional[datetime] = None

    class Config:
        from_attributes = True


# Screening Schemas
class ScreeningResultBase(BaseModel):
    transaction_id: str
    rule_id: int
    rule_name: str
    rule_version: int = 1
    status: str  # flagged, clean, error
    risk_score: float = 0.0
    details: Optional[Dict[str, Any]] = None
    aggregated_data: Optional[Dict[str, Any]] = None
    screening_type: str = "real_time"


class ScreeningResult(ScreeningResultBase):
    id: int
    created_at: datetime

    class Config:
        from_attributes = True


class BatchScreeningRequest(BaseModel):
    name: Optional[str] = None
    description: Optional[str] = None
    transaction_filters: Optional[Dict[str, Any]] = None
    rule_ids: Optional[List[int]] = None  # If None, apply all active rules
    date_from: Optional[datetime] = None
    date_to: Optional[datetime] = None


class ScreeningBatch(BaseModel):
    id: int
    batch_id: str
    name: Optional[str] = None
    description: Optional[str] = None
    status: str
    total_transactions: int = 0
    processed_transactions: int = 0
    flagged_transactions: int = 0
    rules_applied: Optional[List[int]] = None
    started_at: Optional[datetime] = None
    completed_at: Optional[datetime] = None
    created_at: datetime

    class Config:
        from_attributes = True


# Response Schemas
class PaginatedResponse(BaseModel):
    items: List[Any]
    total: int
    page: int
    page_size: int
    total_pages: int


class ScreeningResponse(BaseModel):
    transaction_id: str
    results: List[ScreeningResult]
    overall_status: str  # clean, flagged
    total_risk_score: float
    screening_duration_ms: float


class AggregateRequest(BaseModel):
    aggregate_function: str  # sum, count, avg, max, min
    field: str
    group_by: Optional[List[str]] = None
    filters: Optional[Dict[str, Any]] = None
    time_window: Optional[str] = None  # "24h", "7d", "30d"
    date_from: Optional[datetime] = None
    date_to: Optional[datetime] = None


class AggregateResponse(BaseModel):
    result: Dict[str, Any]
    cache_hit: bool = False
    computation_time_ms: float
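These schemas define the rule format consumed by the rule engine. As a sketch, a velocity-style rule that flags a user whose debit volume exceeds 500,000 NGN over 24 hours could be expressed as below; the field names follow the schemas above, while the threshold and risk score are illustrative values only.

```python
from app.core.schemas import RuleCreate

example_rule = {
    "name": "high-velocity-debit-24h",
    "description": "Flag users whose debit volume exceeds 500,000 NGN in 24 hours",
    "rule_type": "velocity",
    "conditions": [
        {
            # Aggregate condition: sum of `amount` over the last 24h, grouped per user.
            "field": "amount",
            "operator": "gt",
            "value": 500000,
            "aggregate_function": "sum",
            "time_window": "24h",
            "group_by": ["user_id"],
        },
        # Plain field condition: only applies to debits.
        {"field": "transaction_type", "operator": "eq", "value": "debit"},
    ],
    "actions": [
        {"action_type": "flag", "parameters": {"risk_score": 80}},
    ],
    "priority": 10,
    "is_active": True,
}

# Validates cleanly against the Pydantic models defined above.
rule = RuleCreate(**example_rule)
```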
0  app/db/__init__.py  Normal file

3  app/db/base.py  Normal file
@@ -0,0 +1,3 @@
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

22  app/db/session.py  Normal file
@@ -0,0 +1,22 @@
from pathlib import Path
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

DB_DIR = Path("/app/storage/db")
DB_DIR.mkdir(parents=True, exist_ok=True)

SQLALCHEMY_DATABASE_URL = f"sqlite:///{DB_DIR}/db.sqlite"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL,
    connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
0  app/models/__init__.py  Normal file

34  app/models/rule.py  Normal file
@@ -0,0 +1,34 @@
from sqlalchemy import Column, Integer, String, Text, Boolean, DateTime
from sqlalchemy.sql import func
from app.db.base import Base


class Rule(Base):
    __tablename__ = "rules"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, unique=True, index=True, nullable=False)
    description = Column(Text, nullable=True)
    rule_type = Column(String, nullable=False)  # velocity, amount_limit, blacklist, pattern, etc.
    conditions = Column(Text, nullable=False)  # JSON string with rule conditions
    actions = Column(Text, nullable=False)  # JSON string with actions to take
    priority = Column(Integer, default=1)  # Higher number = higher priority
    is_active = Column(Boolean, default=True)
    version = Column(Integer, default=1)
    created_by = Column(String, nullable=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now())
    updated_at = Column(DateTime(timezone=True), onupdate=func.now())


class RuleVersion(Base):
    __tablename__ = "rule_versions"

    id = Column(Integer, primary_key=True, index=True)
    rule_id = Column(Integer, nullable=False, index=True)
    version = Column(Integer, nullable=False)
    name = Column(String, nullable=False)
    description = Column(Text, nullable=True)
    rule_type = Column(String, nullable=False)
    conditions = Column(Text, nullable=False)
    actions = Column(Text, nullable=False)
    priority = Column(Integer, default=1)
    created_by = Column(String, nullable=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now())
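Since conditions and actions are plain Text columns, rule payloads are serialized to JSON strings before persistence and parsed back by the rule engine. A small sketch of inserting a rule directly through the ORM; the rule content is illustrative, and in practice the rule-management endpoints in this commit handle this serialization.

```python
# Hypothetical direct insert, e.g. for seeding a development database.
import json

from app.db.session import SessionLocal
from app.models.rule import Rule

db = SessionLocal()
try:
    rule = Rule(
        name="large-single-debit",
        rule_type="amount_limit",
        # Stored as JSON strings; RuleEngine json.loads() them at evaluation time.
        conditions=json.dumps([{"field": "amount", "operator": "gte", "value": 1000000}]),
        actions=json.dumps([{"action_type": "flag", "parameters": {"risk_score": 60}}]),
        priority=5,
    )
    db.add(rule)
    db.commit()
finally:
    db.close()
```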
43  app/models/screening.py  Normal file
@@ -0,0 +1,43 @@
from sqlalchemy import Column, Integer, String, Text, DateTime, Float
from sqlalchemy.sql import func
from app.db.base import Base


class ScreeningResult(Base):
    __tablename__ = "screening_results"

    id = Column(Integer, primary_key=True, index=True)
    transaction_id = Column(String, index=True, nullable=False)
    rule_id = Column(Integer, index=True, nullable=False)
    rule_name = Column(String, nullable=False)
    rule_version = Column(Integer, default=1)
    status = Column(String, nullable=False)  # flagged, clean, error
    risk_score = Column(Float, default=0.0)
    details = Column(Text, nullable=True)  # JSON string with detailed results
    aggregated_data = Column(Text, nullable=True)  # JSON string with computed aggregations
    screening_type = Column(String, default="real_time")  # real_time, batch
    created_at = Column(DateTime(timezone=True), server_default=func.now())


class ScreeningBatch(Base):
    __tablename__ = "screening_batches"

    id = Column(Integer, primary_key=True, index=True)
    batch_id = Column(String, unique=True, index=True, nullable=False)
    name = Column(String, nullable=True)
    description = Column(Text, nullable=True)
    status = Column(String, default="pending")  # pending, processing, completed, failed
    total_transactions = Column(Integer, default=0)
    processed_transactions = Column(Integer, default=0)
    flagged_transactions = Column(Integer, default=0)
    rules_applied = Column(Text, nullable=True)  # JSON array of rule IDs
    started_at = Column(DateTime(timezone=True), nullable=True)
    completed_at = Column(DateTime(timezone=True), nullable=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now())


class AggregateCache(Base):
    __tablename__ = "aggregate_cache"

    id = Column(Integer, primary_key=True, index=True)
    cache_key = Column(String, unique=True, index=True, nullable=False)
    cache_value = Column(Text, nullable=False)  # JSON string
    expires_at = Column(DateTime(timezone=True), nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now())
24  app/models/transaction.py  Normal file
@@ -0,0 +1,24 @@
from sqlalchemy import Column, Integer, String, Float, DateTime, Text
from sqlalchemy.sql import func
from app.db.base import Base


class Transaction(Base):
    __tablename__ = "transactions"

    id = Column(Integer, primary_key=True, index=True)
    transaction_id = Column(String, unique=True, index=True, nullable=False)
    user_id = Column(String, index=True, nullable=False)
    account_id = Column(String, index=True, nullable=False)
    amount = Column(Float, nullable=False)
    currency = Column(String, default="NGN")
    transaction_type = Column(String, nullable=False)  # debit, credit, transfer
    merchant_id = Column(String, index=True, nullable=True)
    merchant_category = Column(String, nullable=True)
    channel = Column(String, nullable=False)  # web, mobile, atm, pos
    location = Column(String, nullable=True)
    ip_address = Column(String, nullable=True)
    device_id = Column(String, index=True, nullable=True)
    status = Column(String, default="pending")  # pending, completed, failed
    metadata = Column(Text, nullable=True)  # JSON string for additional fields
    created_at = Column(DateTime(timezone=True), server_default=func.now())
    updated_at = Column(DateTime(timezone=True), onupdate=func.now())
0  app/services/__init__.py  Normal file

371  app/services/rule_engine.py  Normal file
@@ -0,0 +1,371 @@
import json
import time
from datetime import datetime, timedelta
from typing import List, Dict, Any, Optional, Tuple
from sqlalchemy.orm import Session
from sqlalchemy import func, desc

from app.models.transaction import Transaction
from app.models.rule import Rule
from app.models.screening import AggregateCache
from app.core.schemas import RuleCondition, ScreeningResultBase


class RuleEngine:
    def __init__(self, db: Session):
        self.db = db

    def evaluate_transaction(self, transaction_id: str, rule_ids: Optional[List[int]] = None) -> List[ScreeningResultBase]:
        """
        Evaluate a transaction against all active rules or specific rules.
        """
        # Get transaction
        transaction = self.db.query(Transaction).filter(Transaction.transaction_id == transaction_id).first()
        if not transaction:
            raise ValueError(f"Transaction {transaction_id} not found")

        # Get rules to evaluate
        query = self.db.query(Rule).filter(Rule.is_active)
        if rule_ids:
            query = query.filter(Rule.id.in_(rule_ids))

        rules = query.order_by(desc(Rule.priority)).all()

        results = []
        for rule in rules:
            result = self._evaluate_rule_against_transaction(transaction, rule)
            results.append(result)

        return results

    def _evaluate_rule_against_transaction(self, transaction: Transaction, rule: Rule) -> ScreeningResultBase:
        """
        Evaluate a single rule against a transaction.
        """
        start_time = time.time()

        try:
            conditions = json.loads(rule.conditions)
            actions = json.loads(rule.actions)

            # Evaluate all conditions
            condition_results = []
            aggregated_data = {}

            for condition in conditions:
                condition_obj = RuleCondition(**condition)
                result, agg_data = self._evaluate_condition(transaction, condition_obj)
                condition_results.append(result)
                if agg_data:
                    aggregated_data.update(agg_data)

            # All conditions must be true for the rule to trigger
            rule_triggered = all(condition_results)

            # Determine status and risk score
            if rule_triggered:
                status = "flagged"
                risk_score = self._calculate_risk_score(actions)
                details = {
                    "rule_triggered": True,
                    "conditions_met": len(condition_results),
                    "evaluation_time_ms": (time.time() - start_time) * 1000,
                    "actions": actions
                }
            else:
                status = "clean"
                risk_score = 0.0
                details = {
                    "rule_triggered": False,
                    "conditions_met": sum(condition_results),
                    "total_conditions": len(condition_results),
                    "evaluation_time_ms": (time.time() - start_time) * 1000
                }

            return ScreeningResultBase(
                transaction_id=transaction.transaction_id,
                rule_id=rule.id,
                rule_name=rule.name,
                rule_version=rule.version,
                status=status,
                risk_score=risk_score,
                details=details,
                aggregated_data=aggregated_data,
                screening_type="real_time"
            )

        except Exception as e:
            return ScreeningResultBase(
                transaction_id=transaction.transaction_id,
                rule_id=rule.id,
                rule_name=rule.name,
                rule_version=rule.version,
                status="error",
                risk_score=0.0,
                details={"error": str(e), "evaluation_time_ms": (time.time() - start_time) * 1000},
                aggregated_data={},
                screening_type="real_time"
            )

    def _evaluate_condition(self, transaction: Transaction, condition: RuleCondition) -> Tuple[bool, Dict[str, Any]]:
        """
        Evaluate a single condition against a transaction.
        """
        aggregated_data = {}

        # If this is an aggregate condition, compute the aggregate first
        if condition.aggregate_function:
            aggregate_value, agg_data = self._compute_aggregate(transaction, condition)
            aggregated_data = agg_data
            value_to_compare = aggregate_value
        else:
            # Get the field value from transaction
            value_to_compare = self._get_transaction_field_value(transaction, condition.field)

        # Perform comparison
        return self._compare_values(value_to_compare, condition.operator, condition.value), aggregated_data

    def _get_transaction_field_value(self, transaction: Transaction, field: str) -> Any:
        """
        Get a field value from a transaction object.
        """
        if hasattr(transaction, field):
            return getattr(transaction, field)

        # Check in metadata if field not found in main attributes
        if transaction.metadata:
            metadata = json.loads(transaction.metadata)
            return metadata.get(field)

        return None

    def _compute_aggregate(self, transaction: Transaction, condition: RuleCondition) -> Tuple[Any, Dict[str, Any]]:
        """
        Compute aggregate values based on condition parameters.
        """
        # Generate cache key
        cache_key = self._generate_cache_key(transaction, condition)

        # Check cache first
        cached_result = self._get_cached_aggregate(cache_key)
        if cached_result:
            return cached_result["value"], cached_result

        # Compute aggregate
        query = self.db.query(Transaction)

        # Apply time window filter
        if condition.time_window:
            time_delta = self._parse_time_window(condition.time_window)
            cutoff_time = datetime.utcnow() - time_delta
            query = query.filter(Transaction.created_at >= cutoff_time)

        # Apply group by filters
        if condition.group_by:
            for group_field in condition.group_by:
                group_value = self._get_transaction_field_value(transaction, group_field)
                if group_value is not None:
                    if group_field == "user_id":
                        query = query.filter(Transaction.user_id == group_value)
                    elif group_field == "account_id":
                        query = query.filter(Transaction.account_id == group_value)
                    elif group_field == "device_id":
                        query = query.filter(Transaction.device_id == group_value)
                    # Add more group by fields as needed

        # Apply aggregate function
        if condition.aggregate_function == "count":
            if condition.field == "*":
                result = query.count()
            else:
                # Count non-null values of specific field
                field_attr = getattr(Transaction, condition.field, None)
                if field_attr:
                    result = query.filter(field_attr.isnot(None)).count()
                else:
                    result = 0

        elif condition.aggregate_function == "sum":
            field_attr = getattr(Transaction, condition.field, None)
            if field_attr:
                result = query.with_entities(func.sum(field_attr)).scalar() or 0
            else:
                result = 0

        elif condition.aggregate_function == "avg":
            field_attr = getattr(Transaction, condition.field, None)
            if field_attr:
                result = query.with_entities(func.avg(field_attr)).scalar() or 0
            else:
                result = 0

        elif condition.aggregate_function == "max":
            field_attr = getattr(Transaction, condition.field, None)
            if field_attr:
                result = query.with_entities(func.max(field_attr)).scalar() or 0
            else:
                result = 0

        elif condition.aggregate_function == "min":
            field_attr = getattr(Transaction, condition.field, None)
            if field_attr:
                result = query.with_entities(func.min(field_attr)).scalar() or 0
            else:
                result = 0

        else:
            raise ValueError(f"Unsupported aggregate function: {condition.aggregate_function}")

        # Cache the result
        aggregate_data = {
            "value": result,
            "function": condition.aggregate_function,
            "field": condition.field,
            "time_window": condition.time_window,
            "group_by": condition.group_by,
            "computed_at": datetime.utcnow().isoformat(),
            "cache_key": cache_key
        }

        self._cache_aggregate(cache_key, aggregate_data, condition.time_window)

        return result, aggregate_data

    def _generate_cache_key(self, transaction: Transaction, condition: RuleCondition) -> str:
        """
        Generate a cache key for aggregate computation.
        """
        key_parts = [
            condition.aggregate_function,
            condition.field,
            condition.time_window or "no_window"
        ]

        if condition.group_by:
            for group_field in condition.group_by:
                group_value = self._get_transaction_field_value(transaction, group_field)
                key_parts.append(f"{group_field}:{group_value}")

        return ":".join(str(part) for part in key_parts)

    def _get_cached_aggregate(self, cache_key: str) -> Optional[Dict[str, Any]]:
        """
        Retrieve cached aggregate result if not expired.
        """
        cached = self.db.query(AggregateCache).filter(
            AggregateCache.cache_key == cache_key,
            AggregateCache.expires_at > datetime.utcnow()
        ).first()

        if cached:
            return json.loads(cached.cache_value)

        return None

    def _cache_aggregate(self, cache_key: str, data: Dict[str, Any], time_window: Optional[str]):
        """
        Cache aggregate result with appropriate expiration.
        """
        # Determine cache expiration based on time window
        if time_window:
            time_delta = self._parse_time_window(time_window)
            # Cache for 10% of the time window, minimum 1 minute, maximum 1 hour
            cache_duration = max(min(time_delta * 0.1, timedelta(hours=1)), timedelta(minutes=1))
        else:
            cache_duration = timedelta(minutes=5)  # Default 5 minutes

        expires_at = datetime.utcnow() + cache_duration

        # Upsert cache entry
        existing = self.db.query(AggregateCache).filter(AggregateCache.cache_key == cache_key).first()
        if existing:
            existing.cache_value = json.dumps(data)
            existing.expires_at = expires_at
        else:
            cache_entry = AggregateCache(
                cache_key=cache_key,
                cache_value=json.dumps(data),
                expires_at=expires_at
            )
            self.db.add(cache_entry)

        self.db.commit()

    def _parse_time_window(self, time_window: str) -> timedelta:
        """
        Parse time window string into timedelta.
        Supports: 1h, 24h, 7d, 30d, etc.
        """
        if time_window.endswith('h'):
            hours = int(time_window[:-1])
            return timedelta(hours=hours)
        elif time_window.endswith('d'):
            days = int(time_window[:-1])
            return timedelta(days=days)
        elif time_window.endswith('m'):
            minutes = int(time_window[:-1])
            return timedelta(minutes=minutes)
        else:
            raise ValueError(f"Unsupported time window format: {time_window}")

    def _compare_values(self, left: Any, operator: str, right: Any) -> bool:
        """
        Compare two values using the specified operator.
        """
        if left is None:
            return False

        try:
            if operator == "eq":
                return left == right
            elif operator == "ne":
                return left != right
            elif operator == "gt":
                return float(left) > float(right)
            elif operator == "gte":
                return float(left) >= float(right)
            elif operator == "lt":
                return float(left) < float(right)
            elif operator == "lte":
                return float(left) <= float(right)
            elif operator == "in":
                return left in right
            elif operator == "not_in":
                return left not in right
            elif operator == "contains":
                return str(right).lower() in str(left).lower()
            elif operator == "starts_with":
                return str(left).lower().startswith(str(right).lower())
            elif operator == "ends_with":
                return str(left).lower().endswith(str(right).lower())
            else:
                raise ValueError(f"Unsupported operator: {operator}")
        except (ValueError, TypeError):
            return False

    def _calculate_risk_score(self, actions: List[Dict[str, Any]]) -> float:
        """
        Calculate risk score based on rule actions.
        """
        max_score = 0.0

        for action in actions:
            if action.get("action_type") == "score":
                score = action.get("parameters", {}).get("risk_score", 0.0)
                max_score = max(max_score, score)
            elif action.get("action_type") == "flag":
                score = action.get("parameters", {}).get("risk_score", 50.0)
                max_score = max(max_score, score)
            elif action.get("action_type") == "block":
                score = action.get("parameters", {}).get("risk_score", 100.0)
                max_score = max(max_score, score)

        return max_score

    def cleanup_expired_cache(self):
        """
        Remove expired cache entries.
        """
        self.db.query(AggregateCache).filter(
            AggregateCache.expires_at < datetime.utcnow()
        ).delete()
        self.db.commit()
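A usage sketch for the engine outside the API layer (for example from a script or test), assuming a transaction and at least one active rule already exist in the database; the transaction ID below is illustrative.

```python
# Hypothetical direct use of RuleEngine with a raw SQLAlchemy session.
from app.db.session import SessionLocal
from app.services.rule_engine import RuleEngine

db = SessionLocal()
try:
    engine = RuleEngine(db)

    # Evaluate one transaction against every active rule (pass rule_ids to restrict the set).
    results = engine.evaluate_transaction("txn-0001")
    for r in results:
        print(r.rule_name, r.status, r.risk_score)

    # Periodically drop stale aggregate cache rows.
    engine.cleanup_expired_cache()
finally:
    db.close()
```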
36  main.py  Normal file
@@ -0,0 +1,36 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app.api.routes import router as api_router

app = FastAPI(
    title="Transaction Fraud Monitoring API",
    description="API-driven transaction monitoring system for fraud detection",
    version="1.0.0",
    openapi_url="/openapi.json"
)

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

app.include_router(api_router, prefix="/api/v1")


@app.get("/")
async def root():
    return {
        "title": "Transaction Fraud Monitoring API",
        "documentation": "/docs",
        "health": "/health"
    }


@app.get("/health")
async def health_check():
    return {"status": "healthy", "service": "transaction-fraud-monitoring"}


if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
42  openapi.json  Normal file
@@ -0,0 +1,42 @@
{
  "openapi": "3.0.2",
  "info": {
    "title": "Transaction Fraud Monitoring API",
    "description": "API-driven transaction monitoring system for fraud detection",
    "version": "1.0.0"
  },
  "paths": {
    "/": {
      "get": {
        "summary": "Root",
        "operationId": "root__get",
        "responses": {
          "200": {
            "description": "Successful Response",
            "content": {
              "application/json": {
                "schema": {}
              }
            }
          }
        }
      }
    },
    "/health": {
      "get": {
        "summary": "Health Check",
        "operationId": "health_check_health_get",
        "responses": {
          "200": {
            "description": "Successful Response",
            "content": {
              "application/json": {
                "schema": {}
              }
            }
          }
        }
      }
    }
  }
}
9  requirements.txt  Normal file
@@ -0,0 +1,9 @@
fastapi==0.104.1
uvicorn[standard]==0.24.0
sqlalchemy==2.0.23
alembic==1.13.1
pydantic==2.5.0
python-multipart==0.0.6
python-jose[cryptography]==3.3.0
passlib[bcrypt]==1.7.4
ruff==0.1.7