Implement Solana Arbitrage Detector with FastAPI

commit d1432e150e
parent f1bb650a56
93 README.md
@@ -1,3 +1,92 @@
-# FastAPI Application
+# Solana Arbitrage Detector

-This is a FastAPI application bootstrapped by BackendIM, the AI-powered backend generation platform.
+A FastAPI-based service that scans the Solana blockchain for arbitrage transactions. The detector analyzes recent blocks to identify and track potential arbitrage opportunities.

## Features

- **Blockchain Scanning**: Monitors the latest Solana blocks for transactions
- **Arbitrage Detection**: Analyzes transactions to identify potential arbitrage patterns
- **Database Storage**: Persists blocks, transactions, and arbitrage events
- **REST API**: Provides endpoints to access arbitrage data and control scanning
- **Background Processing**: Automatic periodic scanning of the blockchain
- **Health Monitoring**: Health check endpoint for system monitoring

## API Endpoints

### Arbitrage API

- `POST /api/v1/arbitrage/scan` - Trigger a blockchain scan for arbitrage
- `GET /api/v1/arbitrage/status` - Get current scan status
- `GET /api/v1/arbitrage/events` - List detected arbitrage events
- `GET /api/v1/arbitrage/events/{event_id}` - Get details of a specific arbitrage event

### Health Check

- `GET /health` - Check application health status
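
As a quick smoke test, the endpoints above can be exercised with `httpx` (already pinned in `requirements.txt`). This is a minimal sketch, assuming a dev server running locally on port 8000; the response field names come from `ScanStatusResponse` and `ArbitrageEventResponse` in `app/schemas/schemas.py`:

```
import httpx

BASE_URL = "http://localhost:8000"  # assumption: local dev server

# Trigger a scan of the 5 most recent blocks (num_blocks must be 1-100)
status = httpx.post(f"{BASE_URL}/api/v1/arbitrage/scan", json={"num_blocks": 5}).json()
print(status["scan_in_progress"], status["last_scanned_block"])

# List up to 10 events with confidence >= 0.5
events = httpx.get(
    f"{BASE_URL}/api/v1/arbitrage/events",
    params={"min_confidence": 0.5, "limit": 10},
).json()
for event in events:
    print(event["transaction_signature"], event["confidence_score"])
```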

## Setup and Configuration

### Prerequisites

- Python 3.8+
- SQLite (bundled with Python; no separate database server is required)

### Installation

1. Clone the repository
2. Install dependencies:
```
pip install -r requirements.txt
```
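3. Apply the database migrations. Nothing in `main.py` creates the tables at startup, so the bundled Alembic migrations presumably need to be run once first (assuming the standard Alembic CLI):
```
alembic upgrade head
```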

### Environment Variables

The application can be configured using the following environment variables, set in the environment or in a `.env` file (read by the `Settings` class via pydantic-settings):

- `SOLANA_RPC_URL` - Solana RPC URL (default: "https://api.mainnet-beta.solana.com")
- `SOLANA_BLOCKS_TO_SCAN` - Number of blocks to scan per pass (default: 10)
- `SCAN_INTERVAL_SECONDS` - Interval between automatic scans in seconds (default: 60)
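
For example, a `.env` file in the project root that simply restates the defaults:

```
SOLANA_RPC_URL=https://api.mainnet-beta.solana.com
SOLANA_BLOCKS_TO_SCAN=10
SCAN_INTERVAL_SECONDS=60
```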

### Running the Application

1. Start the application:
```
uvicorn main:app --host 0.0.0.0 --port 8000
```

2. Access the API documentation at `http://localhost:8000/docs`

## Implementation Details

### Arbitrage Detection

The current implementation uses a simple pattern-matching approach to identify potential arbitrage transactions:

1. Looks for transactions with multiple token swap operations
2. Identifies circular patterns (A → B → C → A; see the sketch below)
3. Assigns a confidence score based on pattern recognition
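
A minimal, self-contained sketch of the circular-pattern check in step 2. It assumes swap legs have already been reduced to ordered `(token_in, token_out)` pairs, which the current placeholder detector does not yet do:

```
from typing import List, Tuple

def is_circular(swap_legs: List[Tuple[str, str]]) -> bool:
    """Return True if ordered (token_in, token_out) legs form a cycle A -> ... -> A."""
    if len(swap_legs) < 2:
        return False
    # Each leg must continue where the previous one ended...
    for (_, out_token), (next_in, _) in zip(swap_legs, swap_legs[1:]):
        if out_token != next_in:
            return False
    # ...and the path must end on the token it started from.
    return swap_legs[-1][1] == swap_legs[0][0]

# A -> B -> C -> A: circular
assert is_circular([("A", "B"), ("B", "C"), ("C", "A")])
# A -> B -> C: not circular
assert not is_circular([("A", "B"), ("B", "C")])
```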

In a production environment, this would be enhanced with:
- Token price data integration
- More sophisticated pattern matching
- Profit calculation algorithms
- Machine learning-based classification

### Database Schema

- `blocks` - Stores block data from the Solana blockchain (keyed by the unique `slot`)
- `transactions` - Records transactions from scanned blocks (`block_id` references `blocks.slot`)
- `token_transfers` - Tracks token transfers within transactions (`transaction_id` references `transactions.id`)
- `arbitrage_events` - Stores detected arbitrage events with metadata (`transaction_id` references `transactions.id`)
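
Note that `arbitrage_events.path` is persisted as a JSON-encoded list (written with `json.dumps` in the detector), so consumers decode it before use; a tiny sketch with an illustrative value:

```
import json

path = '["token_1", "token_2", "token_3"]'  # illustrative ArbitrageEvent.path value
token_path = json.loads(path)
assert token_path[0] == "token_1"
```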

## Future Enhancements

- Real-time profit calculation using token price feeds
- Support for specific DEX protocols (Raydium, Orca, Jupiter, etc.)
- User notifications for arbitrage opportunities
- Historical analytics and trend visualization
- Machine learning for improved detection accuracy

## License

[MIT License](LICENSE)
84 alembic.ini Normal file
@@ -0,0 +1,84 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
#truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; this defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat alembic/versions

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = sqlite:////app/storage/db/db.sqlite

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks=black
# black.type=console_scripts
# black.entrypoint=black
# black.options=-l 79

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
1 app/__init__.py Normal file
@@ -0,0 +1 @@
# App package
1 app/api/__init__.py Normal file
@@ -0,0 +1 @@
# API package
1 app/api/v1/__init__.py Normal file
@@ -0,0 +1 @@
# API v1 package
7 app/api/v1/api.py Normal file
@@ -0,0 +1,7 @@
from fastapi import APIRouter

from app.api.v1.endpoints import arbitrage

api_router = APIRouter()

api_router.include_router(arbitrage.router, prefix="/arbitrage", tags=["arbitrage"])
1 app/api/v1/endpoints/__init__.py Normal file
@@ -0,0 +1 @@
# Endpoints module
124 app/api/v1/endpoints/arbitrage.py Normal file
@@ -0,0 +1,124 @@
from typing import List, Optional

from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException
from sqlalchemy.orm import Session

from app.db.session import get_db
from app.models.models import ArbitrageEvent
from app.schemas.schemas import (ArbitrageEventResponse, ScanRequest,
                                 ScanStatusResponse)
from app.services.scanner import BlockScanner

router = APIRouter()


@router.post("/scan", response_model=ScanStatusResponse)
async def scan_for_arbitrage(
    background_tasks: BackgroundTasks,
    scan_request: ScanRequest,
    db: Session = Depends(get_db),
):
    """
    Scan the blockchain for arbitrage transactions
    """
    scanner = BlockScanner(db)

    # Add scanning task to background tasks
    # NOTE: depending on the FastAPI version, the request-scoped session may
    # be closed before this background task runs; a dedicated session (as the
    # scheduler uses) would be safer
    background_tasks.add_task(
        scanner.scan_blocks,
        num_blocks=scan_request.num_blocks,
        start_slot=scan_request.start_slot,
    )

    # Return current scan status
    status = scanner.get_scan_status()
    return ScanStatusResponse(**status)


@router.get("/status", response_model=ScanStatusResponse)
async def get_scan_status(
    db: Session = Depends(get_db),
):
    """
    Get the current status of the arbitrage scanner
    """
    scanner = BlockScanner(db)
    status = scanner.get_scan_status()
    return ScanStatusResponse(**status)


@router.get("/events", response_model=List[ArbitrageEventResponse])
async def get_arbitrage_events(
    skip: int = 0,
    limit: int = 100,
    min_confidence: float = 0.0,
    token_address: Optional[str] = None,
    db: Session = Depends(get_db),
):
    """
    Get detected arbitrage events
    """
    query = db.query(ArbitrageEvent)

    # Apply filters
    if min_confidence > 0:
        query = query.filter(ArbitrageEvent.confidence_score >= min_confidence)

    if token_address:
        query = query.filter(ArbitrageEvent.profit_token_address == token_address)

    # Get paginated results
    events = query.order_by(ArbitrageEvent.detected_at.desc()).offset(skip).limit(limit).all()

    # Convert to response format
    result = []
    for event in events:
        # Get transaction signature and block slot
        transaction = event.transaction
        block_slot = transaction.block_id if transaction else 0

        result.append(
            ArbitrageEventResponse(
                id=event.id,
                transaction_signature=transaction.signature if transaction else "",
                profit_token_address=event.profit_token_address,
                profit_amount=event.profit_amount,
                profit_usd=event.profit_usd,
                path=event.path,
                confidence_score=event.confidence_score,
                detected_at=event.detected_at,
                block_slot=block_slot,
            )
        )

    return result


@router.get("/events/{event_id}", response_model=ArbitrageEventResponse)
async def get_arbitrage_event(
    event_id: int,
    db: Session = Depends(get_db),
):
    """
    Get a specific arbitrage event by ID
    """
    event = db.query(ArbitrageEvent).filter(ArbitrageEvent.id == event_id).first()

    if not event:
        raise HTTPException(status_code=404, detail="Arbitrage event not found")

    # Get transaction signature and block slot
    transaction = event.transaction
    block_slot = transaction.block_id if transaction else 0

    return ArbitrageEventResponse(
        id=event.id,
        transaction_signature=transaction.signature if transaction else "",
        profit_token_address=event.profit_token_address,
        profit_amount=event.profit_amount,
        profit_usd=event.profit_usd,
        path=event.path,
        confidence_score=event.confidence_score,
        detected_at=event.detected_at,
        block_slot=block_slot,
    )
42 app/core/config.py Normal file
@@ -0,0 +1,42 @@
from pathlib import Path
from typing import List, Union

from pydantic import AnyHttpUrl, validator
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    API_V1_STR: str = "/api/v1"
    PROJECT_NAME: str = "Solana Arbitrage Detector"

    # CORS settings
    BACKEND_CORS_ORIGINS: List[AnyHttpUrl] = []

    @validator("BACKEND_CORS_ORIGINS", pre=True)
    def assemble_cors_origins(cls, v: Union[str, List[str]]) -> Union[List[str], str]:
        if isinstance(v, str) and not v.startswith("["):
            return [i.strip() for i in v.split(",")]
        elif isinstance(v, (list, str)):
            return v
        raise ValueError(v)

    # Solana RPC settings
    SOLANA_RPC_URL: str = "https://api.mainnet-beta.solana.com"
    SOLANA_BLOCKS_TO_SCAN: int = 10

    # Background task settings
    SCAN_INTERVAL_SECONDS: int = 60

    # Database settings
    DB_DIR: Path = Path("/app") / "storage" / "db"
    SQLALCHEMY_DATABASE_URL: str = f"sqlite:///{DB_DIR}/db.sqlite"

    class Config:
        case_sensitive = True
        env_file = ".env"


settings = Settings()

# Ensure DB directory exists
settings.DB_DIR.mkdir(parents=True, exist_ok=True)
82 app/core/scheduler.py Normal file
@@ -0,0 +1,82 @@
import logging
from datetime import datetime

from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.interval import IntervalTrigger

from app.core.config import settings
from app.db.session import SessionLocal
from app.services.scanner import BlockScanner

logger = logging.getLogger(__name__)


class BlockScanScheduler:
    """
    Scheduler for periodic blockchain scanning
    """

    def __init__(self):
        self.scheduler = BackgroundScheduler()
        self.scan_job = None

    def start(self):
        """
        Start the scheduler
        """
        if self.scan_job is None:
            # Add job for scanning blocks periodically
            self.scan_job = self.scheduler.add_job(
                self._scan_blocks,
                IntervalTrigger(seconds=settings.SCAN_INTERVAL_SECONDS),
                id="scan_blocks",
                name="Periodic Block Scan",
                replace_existing=True,
            )

            logger.info(f"Scheduled block scanning every {settings.SCAN_INTERVAL_SECONDS} seconds")

        # Start the scheduler
        self.scheduler.start()
        logger.info("Block scan scheduler started")

    def shutdown(self):
        """
        Shutdown the scheduler
        """
        if self.scheduler.running:
            self.scheduler.shutdown()
            logger.info("Block scan scheduler shut down")

    def _scan_blocks(self):
        """
        Scan blocks for arbitrage transactions
        """
        logger.info("Starting periodic block scan")

        db = SessionLocal()
        try:
            scanner = BlockScanner(db)

            # Skip if a scan is already in progress
            # NOTE: the flag lives on this freshly created BlockScanner, so it
            # cannot see scans started elsewhere (e.g. via the API endpoint)
            if scanner.scan_in_progress:
                logger.warning("Scan already in progress, skipping scheduled scan")
                return

            # Scan latest blocks
            events = scanner.scan_blocks(settings.SOLANA_BLOCKS_TO_SCAN)

            logger.info(
                f"Completed block scan at {datetime.utcnow().isoformat()}, "
                f"found {len(events)} arbitrage events"
            )

        except Exception as e:
            logger.error(f"Error during scheduled block scan: {str(e)}")

        finally:
            db.close()


# Singleton instance
scheduler = BlockScanScheduler()
25 app/db/session.py Normal file
@@ -0,0 +1,25 @@
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

from app.core.config import settings

engine = create_engine(
    settings.SQLALCHEMY_DATABASE_URL,
    connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base = declarative_base()


def get_db():
    """
    Dependency for getting DB session
    """
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
0 app/models/base.py Normal file
66 app/models/models.py Normal file
@@ -0,0 +1,66 @@
from datetime import datetime

from sqlalchemy import (Boolean, Column, DateTime, Float, ForeignKey, Integer,
                        String, Text)
from sqlalchemy.orm import relationship

from app.db.session import Base


class Block(Base):
    __tablename__ = "blocks"

    id = Column(Integer, primary_key=True, index=True)
    slot = Column(Integer, unique=True, index=True)
    blockhash = Column(String, unique=True, index=True)
    parent_blockhash = Column(String, index=True)
    timestamp = Column(DateTime, default=datetime.utcnow)
    processed = Column(Boolean, default=False)

    transactions = relationship("Transaction", back_populates="block")


class Transaction(Base):
    __tablename__ = "transactions"

    id = Column(Integer, primary_key=True, index=True)
    signature = Column(String, unique=True, index=True)
    block_id = Column(Integer, ForeignKey("blocks.slot"))  # NOTE: references blocks.slot, not blocks.id
    timestamp = Column(DateTime, default=datetime.utcnow)
    fee = Column(Integer)
    status = Column(String)
    raw_data = Column(Text)

    block = relationship("Block", back_populates="transactions")
    token_transfers = relationship("TokenTransfer", back_populates="transaction")
    arbitrage_events = relationship("ArbitrageEvent", back_populates="transaction")


class TokenTransfer(Base):
    __tablename__ = "token_transfers"

    id = Column(Integer, primary_key=True, index=True)
    transaction_id = Column(Integer, ForeignKey("transactions.id"))
    token_address = Column(String, index=True)
    from_address = Column(String, index=True)
    to_address = Column(String, index=True)
    amount = Column(Float)
    program_id = Column(String, index=True)
    timestamp = Column(DateTime, default=datetime.utcnow)

    transaction = relationship("Transaction", back_populates="token_transfers")


class ArbitrageEvent(Base):
    __tablename__ = "arbitrage_events"

    id = Column(Integer, primary_key=True, index=True)
    transaction_id = Column(Integer, ForeignKey("transactions.id"))
    profit_token_address = Column(String, index=True)
    profit_amount = Column(Float)
    profit_usd = Column(Float, nullable=True)
    path = Column(Text)  # JSON string of token paths
    confidence_score = Column(Float)  # 0.0 to 1.0
    detected_at = Column(DateTime, default=datetime.utcnow)

    transaction = relationship("Transaction", back_populates="arbitrage_events")
112 app/schemas/schemas.py Normal file
@@ -0,0 +1,112 @@
from datetime import datetime
from typing import List, Optional

from pydantic import BaseModel, Field


class TokenTransferBase(BaseModel):
    token_address: str
    from_address: str
    to_address: str
    amount: float
    program_id: str


class TokenTransferCreate(TokenTransferBase):
    transaction_id: int


class TokenTransfer(TokenTransferBase):
    id: int
    transaction_id: int
    timestamp: datetime

    class Config:
        from_attributes = True


class ArbitrageEventBase(BaseModel):
    profit_token_address: str
    profit_amount: float
    profit_usd: Optional[float] = None
    path: str
    confidence_score: float = Field(..., ge=0.0, le=1.0)


class ArbitrageEventCreate(ArbitrageEventBase):
    transaction_id: int


class ArbitrageEvent(ArbitrageEventBase):
    id: int
    transaction_id: int
    detected_at: datetime

    class Config:
        from_attributes = True


class TransactionBase(BaseModel):
    signature: str
    fee: int
    status: str
    raw_data: str


class TransactionCreate(TransactionBase):
    block_id: int


class Transaction(TransactionBase):
    id: int
    block_id: int
    timestamp: datetime
    token_transfers: List[TokenTransfer] = []
    arbitrage_events: List[ArbitrageEvent] = []

    class Config:
        from_attributes = True


class BlockBase(BaseModel):
    slot: int
    blockhash: str
    parent_blockhash: str
    processed: bool = False


class BlockCreate(BlockBase):
    pass


class Block(BlockBase):
    id: int
    timestamp: datetime
    transactions: List[Transaction] = []

    class Config:
        from_attributes = True


class ArbitrageEventResponse(BaseModel):
    id: int
    transaction_signature: str
    profit_token_address: str
    profit_amount: float
    profit_usd: Optional[float] = None
    path: str
    confidence_score: float
    detected_at: datetime
    block_slot: int


class ScanStatusResponse(BaseModel):
    last_scanned_block: int
    last_scan_time: datetime
    arbitrage_events_count: int
    scan_in_progress: bool


class ScanRequest(BaseModel):
    num_blocks: int = Field(10, ge=1, le=100)
    start_slot: Optional[int] = None
266 app/services/scanner.py Normal file
@@ -0,0 +1,266 @@
import json
import logging
from datetime import datetime
from typing import Dict, List, Optional, Tuple

from sqlalchemy.orm import Session

from app.core.config import settings
from app.models.models import ArbitrageEvent, Block, TokenTransfer, Transaction
from app.schemas.schemas import (ArbitrageEventCreate, BlockCreate,
                                 TokenTransferCreate, TransactionCreate)
from app.services.solana_client import SolanaClient

logger = logging.getLogger(__name__)


class BlockScanner:
    """
    Scanner service for processing Solana blocks and detecting arbitrage
    """

    def __init__(self, db: Session, solana_client: Optional[SolanaClient] = None):
        """
        Initialize scanner with database session and Solana client
        """
        self.db = db
        self.solana_client = solana_client or SolanaClient()
        self._scan_in_progress = False

    @property
    def scan_in_progress(self) -> bool:
        """
        Check if a scan is currently in progress
        """
        return self._scan_in_progress

    def get_last_scanned_block(self) -> Optional[int]:
        """
        Get the latest block that was scanned
        """
        latest_block = self.db.query(Block).order_by(Block.slot.desc()).first()
        if latest_block:
            return latest_block.slot
        return None

    def save_block(self, block_data: Dict) -> Block:
        """
        Save block data to database
        """
        block_create = BlockCreate(
            # NOTE: parentSlot + 1 is an approximation; Solana slots can be skipped
            slot=block_data["parentSlot"] + 1 if "parentSlot" in block_data else 0,
            blockhash=block_data.get("blockhash", ""),
            parent_blockhash=block_data.get("previousBlockhash", ""),
            processed=False,
        )

        block = Block(**block_create.model_dump())
        self.db.add(block)
        self.db.commit()
        self.db.refresh(block)

        return block

    def save_transaction(self, tx_data: Dict, block_id: int) -> Transaction:
        """
        Save transaction data to database
        """
        signature = tx_data.get("transaction", {}).get("signatures", [""])[0]

        tx_create = TransactionCreate(
            signature=signature,
            block_id=block_id,
            fee=tx_data.get("meta", {}).get("fee", 0),
            status="success" if tx_data.get("meta", {}).get("err") is None else "error",
            raw_data=json.dumps(tx_data),
        )

        transaction = Transaction(**tx_create.model_dump())
        self.db.add(transaction)
        self.db.commit()
        self.db.refresh(transaction)

        return transaction

    def save_token_transfers(self, tx_data: Dict, transaction_id: int) -> List[TokenTransfer]:
        """
        Extract and save token transfers from transaction data
        """
        # This would require parsing the transaction logs and instruction data
        # For simplicity, we'll just create a placeholder function

        # In a real implementation, we would extract token transfer info
        # from the transaction instructions and logs

        transfers = []
        # Placeholder for demonstration purposes
        if "meta" in tx_data and "postTokenBalances" in tx_data["meta"] and "preTokenBalances" in tx_data["meta"]:
            post_balances = tx_data["meta"]["postTokenBalances"]
            pre_balances = tx_data["meta"]["preTokenBalances"]

            # Compare pre and post balances to identify transfers
            # This is a simplified approach
            for post in post_balances:
                mint = post.get("mint", "")
                owner = post.get("owner", "")

                # Find matching pre-balance
                pre = next((p for p in pre_balances if p.get("mint") == mint and p.get("owner") == owner), None)

                if pre:
                    # Calculate amount change
                    pre_amount = float(pre.get("uiTokenAmount", {}).get("uiAmount", 0) or 0)
                    post_amount = float(post.get("uiTokenAmount", {}).get("uiAmount", 0) or 0)
                    amount_change = post_amount - pre_amount

                    if amount_change != 0:
                        # Create token transfer record
                        transfer_create = TokenTransferCreate(
                            transaction_id=transaction_id,
                            token_address=mint,
                            from_address="" if amount_change > 0 else owner,
                            to_address=owner if amount_change > 0 else "",
                            amount=abs(amount_change),
                            program_id=post.get("programId", ""),
                        )

                        transfer = TokenTransfer(**transfer_create.model_dump())
                        self.db.add(transfer)
                        transfers.append(transfer)

        if transfers:
            self.db.commit()
            for transfer in transfers:
                self.db.refresh(transfer)

        return transfers

    def save_arbitrage_event(self, arbitrage_data: Dict, transaction_id: int) -> ArbitrageEvent:
        """
        Save detected arbitrage event to database
        """
        event_create = ArbitrageEventCreate(
            transaction_id=transaction_id,
            profit_token_address=arbitrage_data["profit_token_address"],
            profit_amount=arbitrage_data["profit_amount"],
            profit_usd=arbitrage_data.get("profit_usd"),
            path=arbitrage_data["path"],
            confidence_score=arbitrage_data["confidence_score"],
        )

        event = ArbitrageEvent(**event_create.model_dump())
        self.db.add(event)
        self.db.commit()
        self.db.refresh(event)

        return event

    def process_transaction(self, tx_data: Dict, block_id: int) -> Tuple[Transaction, Optional[ArbitrageEvent]]:
        """
        Process a single transaction, looking for arbitrage
        """
        # Save transaction record
        transaction = self.save_transaction(tx_data, block_id)

        # Extract and save token transfers
        self.save_token_transfers(tx_data, transaction.id)

        # Analyze for arbitrage
        is_arbitrage, arbitrage_data = self.solana_client.analyze_transaction_for_arbitrage(tx_data)

        arbitrage_event = None
        if is_arbitrage and arbitrage_data:
            arbitrage_event = self.save_arbitrage_event(arbitrage_data, transaction.id)
            logger.info(f"Detected arbitrage in transaction {transaction.signature}")

        return transaction, arbitrage_event

    def process_block(self, block_data: Dict) -> Tuple[Block, List[ArbitrageEvent]]:
        """
        Process a single block, looking for arbitrage transactions
        """
        # Save block record
        block = self.save_block(block_data)

        arbitrage_events = []

        # Process each transaction in the block
        if "transactions" in block_data:
            for tx_data in block_data["transactions"]:
                _, arbitrage_event = self.process_transaction(tx_data, block.slot)
                if arbitrage_event:
                    arbitrage_events.append(arbitrage_event)

        # Mark block as processed
        block.processed = True
        self.db.commit()
        self.db.refresh(block)

        return block, arbitrage_events

    def scan_blocks(self, num_blocks: Optional[int] = None, start_slot: Optional[int] = None) -> List[ArbitrageEvent]:
        """
        Scan the specified number of latest blocks for arbitrage
        """
        if self._scan_in_progress:
            logger.warning("Scan already in progress, ignoring request")
            return []

        self._scan_in_progress = True
        all_events = []

        try:
            num_blocks = num_blocks or settings.SOLANA_BLOCKS_TO_SCAN

            # Get latest block if start_slot not provided
            if start_slot is None:
                latest_block = self.solana_client.get_latest_block()
                # Get block at that slot
                # NOTE: lastValidBlockHeight is a block height, not a slot;
                # this is a rough placeholder for locating a recent block
                block_data = self.solana_client.get_block(latest_block["lastValidBlockHeight"])
                start_slot = block_data["parentSlot"]

            # Get list of blocks to scan
            end_slot = start_slot - num_blocks
            if end_slot < 0:
                end_slot = 0

            blocks_to_scan = self.solana_client.get_blocks(end_slot, start_slot)

            # Scan each block
            for slot in blocks_to_scan:
                try:
                    block_data = self.solana_client.get_block(slot)
                    _, events = self.process_block(block_data)
                    all_events.extend(events)
                except Exception as e:
                    logger.error(f"Error processing block {slot}: {str(e)}")
                    continue

        except Exception as e:
            logger.error(f"Error scanning blocks: {str(e)}")

        finally:
            self._scan_in_progress = False

        return all_events

    def get_scan_status(self) -> Dict:
        """
        Get the current status of the block scanner
        """
        last_block = self.db.query(Block).order_by(Block.slot.desc()).first()
        last_scan_time = None
        last_scanned_block = 0

        if last_block:
            last_scan_time = last_block.timestamp
            last_scanned_block = last_block.slot

        arbitrage_count = self.db.query(ArbitrageEvent).count()

        return {
            "last_scanned_block": last_scanned_block,
            "last_scan_time": last_scan_time or datetime.utcnow(),
            "arbitrage_events_count": arbitrage_count,
            "scan_in_progress": self._scan_in_progress,
        }
169 app/services/solana_client.py Normal file
@@ -0,0 +1,169 @@
import json
import logging
from typing import Dict, List, Optional, Tuple

from solana.rpc.api import Client
from solana.rpc.types import MemcmpOpts

from app.core.config import settings

logger = logging.getLogger(__name__)


class SolanaClient:
    """
    Client for interacting with Solana blockchain

    NOTE: the dict-style response handling below assumes the legacy JSON-RPC
    response shape; recent solana-py releases return typed response objects,
    so the `"error" in response` checks may need adapting.
    """

    def __init__(self, rpc_url: Optional[str] = None):
        """
        Initialize Solana client with RPC URL
        """
        self.rpc_url = rpc_url or settings.SOLANA_RPC_URL
        self.client = Client(self.rpc_url)

    def get_latest_block(self) -> Dict:
        """
        Get latest finalized block
        """
        response = self.client.get_latest_blockhash()
        if "error" in response:
            logger.error(f"Error getting latest block: {response['error']}")
            raise Exception(f"Error getting latest block: {response['error']}")

        return response["result"]

    def get_block(self, slot: int) -> Dict:
        """
        Get block data by slot number
        """
        response = self.client.get_block(
            slot,
            encoding="json",
            max_supported_transaction_version=0,
            transaction_details="full",
        )

        if "error" in response:
            logger.error(f"Error getting block {slot}: {response['error']}")
            raise Exception(f"Error getting block {slot}: {response['error']}")

        return response["result"]

    def get_blocks(self, start_slot: int, end_slot: Optional[int] = None) -> List[int]:
        """
        Get a list of confirmed blocks
        """
        if end_slot is None:
            # Get latest slot if end_slot not provided
            response = self.client.get_slot()
            if "error" in response:
                logger.error(f"Error getting latest slot: {response['error']}")
                raise Exception(f"Error getting latest slot: {response['error']}")

            end_slot = response["result"]

        response = self.client.get_blocks(start_slot, end_slot)

        if "error" in response:
            logger.error(f"Error getting blocks from {start_slot} to {end_slot}: {response['error']}")
            raise Exception(f"Error getting blocks from {start_slot} to {end_slot}: {response['error']}")

        return response["result"]

    def get_transaction(self, signature: str) -> Dict:
        """
        Get transaction details by signature
        """
        response = self.client.get_transaction(
            signature,
            encoding="json",
            max_supported_transaction_version=0,
        )

        if "error" in response:
            logger.error(f"Error getting transaction {signature}: {response['error']}")
            raise Exception(f"Error getting transaction {signature}: {response['error']}")

        return response["result"]

    def get_token_accounts_by_owner(self, owner: str, program_id: Optional[str] = None) -> List[Dict]:
        """
        Get token accounts by owner
        """
        filters = []
        if program_id:
            filters.append(
                MemcmpOpts(
                    offset=0,
                    bytes=program_id,
                )
            )

        response = self.client.get_token_accounts_by_owner(
            owner,
            filters,
            encoding="jsonParsed",
        )

        if "error" in response:
            logger.error(f"Error getting token accounts for {owner}: {response['error']}")
            raise Exception(f"Error getting token accounts for {owner}: {response['error']}")

        return response["result"]["value"]

    def analyze_transaction_for_arbitrage(self, tx_data: Dict) -> Tuple[bool, Optional[Dict]]:
        """
        Analyze transaction data for potential arbitrage

        Returns a tuple of (is_arbitrage, arbitrage_data)
        """
        # This is a simple placeholder implementation
        # A real implementation would require advanced pattern matching
        # and knowledge of Solana DEX structures

        # Check if transaction has instructions
        if not tx_data or "transaction" not in tx_data:
            return False, None

        tx = tx_data["transaction"]
        if "message" not in tx or "instructions" not in tx["message"]:
            return False, None

        instructions = tx["message"]["instructions"]

        # Check if transaction has multiple token swaps
        # This is a simplistic approach - real implementation would be more sophisticated
        swap_count = 0
        token_programs = [
            "9xQeWvG816bUx9EPjHmaT23yvVM2ZWbrrpZb9PusVFin",  # Serum DEX v3
            "SwaPpA9LAaLfeLi3a68M4DjnLqgtticKg6CnyNwgAC8",  # Swap Program
            "27haf8L6oxUeXrHrgEgsexjSY5hbVUWEmvv9Nyxg8vQv",  # Jupiter Aggregator
        ]

        token_addresses = []
        for inst in instructions:
            if inst.get("programId") in token_programs:
                swap_count += 1
                # In a real implementation, we'd parse the instruction data
                # to identify the tokens being swapped

                # For now, we'll just use a placeholder
                # NOTE: these placeholder names are unique per swap, so the
                # circular check below can never match until real mint
                # addresses are parsed from the instruction data
                token_addresses.append(f"token_{swap_count}")

        # Check for circular pattern (token A -> token B -> token C -> token A)
        if swap_count >= 3:
            # Check if first and last token are the same (circular)
            is_circular = len(token_addresses) >= 3 and token_addresses[0] == token_addresses[-1]

            if is_circular:
                # In a real implementation, we'd calculate the actual profit
                # For now, return a placeholder
                return True, {
                    "profit_token_address": token_addresses[0],
                    "profit_amount": 0.01,  # Placeholder
                    "path": json.dumps(token_addresses),
                    "confidence_score": 0.7,  # Placeholder
                }

        return False, None
70 main.py Normal file
@@ -0,0 +1,70 @@
import logging

import uvicorn
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.api.v1.api import api_router
from app.core.config import settings
from app.core.scheduler import scheduler
from app.models import base  # noqa

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)

app = FastAPI(
    title=settings.PROJECT_NAME,
    description="Solana Arbitrage Detector API",
    version="0.1.0",
    openapi_url="/openapi.json",
    docs_url="/docs",
    redoc_url="/redoc",
)

# Configure CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Include API router
app.include_router(api_router, prefix=settings.API_V1_STR)

# Health check endpoint
@app.get("/health", tags=["health"])
async def health_check():
    """Health check endpoint"""
    return {"status": "ok"}


@app.on_event("startup")
async def startup_event():
    """
    Startup event handler
    """
    logger.info("Starting up application")

    # Start the scheduler
    scheduler.start()


@app.on_event("shutdown")
async def shutdown_event():
    """
    Shutdown event handler
    """
    logger.info("Shutting down application")

    # Shut down the scheduler
    scheduler.shutdown()


if __name__ == "__main__":
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)
1 migrations/README Normal file
@@ -0,0 +1 @@
Generic single-database configuration with SQLAlchemy.
82 migrations/env.py Normal file
@@ -0,0 +1,82 @@
from logging.config import fileConfig

from alembic import context
from sqlalchemy import engine_from_config, pool

from app.db.session import Base
from app.models import models  # noqa
from app.core.config import settings

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    configuration = config.get_section(config.config_ini_section)
    configuration["sqlalchemy.url"] = settings.SQLALCHEMY_DATABASE_URL
    connectable = engine_from_config(
        configuration,
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        is_sqlite = connection.dialect.name == "sqlite"
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            render_as_batch=is_sqlite,
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
24 migrations/script.py.mako Normal file
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
110 migrations/versions/initial_migration.py Normal file
@@ -0,0 +1,110 @@
"""initial migration

Revision ID: b61d5f5a1ded
Revises:
Create Date: 2023-10-05 12:00:00.000000

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'b61d5f5a1ded'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # Create blocks table
    op.create_table(
        'blocks',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('slot', sa.Integer(), nullable=False),
        sa.Column('blockhash', sa.String(), nullable=False),
        sa.Column('parent_blockhash', sa.String(), nullable=False),
        sa.Column('timestamp', sa.DateTime(), nullable=True),
        sa.Column('processed', sa.Boolean(), nullable=True),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_index(op.f('ix_blocks_blockhash'), 'blocks', ['blockhash'], unique=True)
    op.create_index(op.f('ix_blocks_id'), 'blocks', ['id'], unique=False)
    op.create_index(op.f('ix_blocks_parent_blockhash'), 'blocks', ['parent_blockhash'], unique=False)
    op.create_index(op.f('ix_blocks_slot'), 'blocks', ['slot'], unique=True)

    # Create transactions table
    op.create_table(
        'transactions',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('signature', sa.String(), nullable=False),
        sa.Column('block_id', sa.Integer(), nullable=True),
        sa.Column('timestamp', sa.DateTime(), nullable=True),
        sa.Column('fee', sa.Integer(), nullable=True),
        sa.Column('status', sa.String(), nullable=True),
        sa.Column('raw_data', sa.Text(), nullable=True),
        sa.ForeignKeyConstraint(['block_id'], ['blocks.slot'], ),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_index(op.f('ix_transactions_id'), 'transactions', ['id'], unique=False)
    op.create_index(op.f('ix_transactions_signature'), 'transactions', ['signature'], unique=True)

    # Create token_transfers table
    op.create_table(
        'token_transfers',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('transaction_id', sa.Integer(), nullable=True),
        sa.Column('token_address', sa.String(), nullable=True),
        sa.Column('from_address', sa.String(), nullable=True),
        sa.Column('to_address', sa.String(), nullable=True),
        sa.Column('amount', sa.Float(), nullable=True),
        sa.Column('program_id', sa.String(), nullable=True),
        sa.Column('timestamp', sa.DateTime(), nullable=True),
        sa.ForeignKeyConstraint(['transaction_id'], ['transactions.id'], ),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_index(op.f('ix_token_transfers_from_address'), 'token_transfers', ['from_address'], unique=False)
    op.create_index(op.f('ix_token_transfers_id'), 'token_transfers', ['id'], unique=False)
    op.create_index(op.f('ix_token_transfers_program_id'), 'token_transfers', ['program_id'], unique=False)
    op.create_index(op.f('ix_token_transfers_to_address'), 'token_transfers', ['to_address'], unique=False)
    op.create_index(op.f('ix_token_transfers_token_address'), 'token_transfers', ['token_address'], unique=False)

    # Create arbitrage_events table
    op.create_table(
        'arbitrage_events',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('transaction_id', sa.Integer(), nullable=True),
        sa.Column('profit_token_address', sa.String(), nullable=True),
        sa.Column('profit_amount', sa.Float(), nullable=True),
        sa.Column('profit_usd', sa.Float(), nullable=True),
        sa.Column('path', sa.Text(), nullable=True),
        sa.Column('confidence_score', sa.Float(), nullable=True),
        sa.Column('detected_at', sa.DateTime(), nullable=True),
        sa.ForeignKeyConstraint(['transaction_id'], ['transactions.id'], ),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_index(op.f('ix_arbitrage_events_id'), 'arbitrage_events', ['id'], unique=False)
    op.create_index(op.f('ix_arbitrage_events_profit_token_address'), 'arbitrage_events', ['profit_token_address'], unique=False)


def downgrade():
    op.drop_index(op.f('ix_arbitrage_events_profit_token_address'), table_name='arbitrage_events')
    op.drop_index(op.f('ix_arbitrage_events_id'), table_name='arbitrage_events')
    op.drop_table('arbitrage_events')

    op.drop_index(op.f('ix_token_transfers_token_address'), table_name='token_transfers')
    op.drop_index(op.f('ix_token_transfers_to_address'), table_name='token_transfers')
    op.drop_index(op.f('ix_token_transfers_program_id'), table_name='token_transfers')
    op.drop_index(op.f('ix_token_transfers_id'), table_name='token_transfers')
    op.drop_index(op.f('ix_token_transfers_from_address'), table_name='token_transfers')
    op.drop_table('token_transfers')

    op.drop_index(op.f('ix_transactions_signature'), table_name='transactions')
    op.drop_index(op.f('ix_transactions_id'), table_name='transactions')
    op.drop_table('transactions')

    op.drop_index(op.f('ix_blocks_slot'), table_name='blocks')
    op.drop_index(op.f('ix_blocks_parent_blockhash'), table_name='blocks')
    op.drop_index(op.f('ix_blocks_id'), table_name='blocks')
    op.drop_index(op.f('ix_blocks_blockhash'), table_name='blocks')
    op.drop_table('blocks')
28 requirements.txt Normal file
@@ -0,0 +1,28 @@
# FastAPI and server
fastapi>=0.103.1
uvicorn>=0.23.2
pydantic>=2.4.2
pydantic-settings>=2.0.3
python-multipart>=0.0.6
python-dotenv>=1.0.0

# Database
sqlalchemy>=2.0.20
alembic>=1.12.0

# Solana interaction
solana>=0.30.2
base58>=2.1.1
borsh-construct>=0.2.0

# Background tasks and scheduling
celery>=5.3.4
redis>=5.0.0
apscheduler>=3.10.4

# Testing
pytest>=7.4.2
httpx>=0.24.1

# Linting
ruff>=0.0.286