Create REST API with FastAPI and SQLite

- Set up the project structure with the FastAPI framework
- Set up a SQLite database with the SQLAlchemy ORM
- Configured Alembic for database migrations
- Implemented the Item model and CRUD operations
- Added a health check endpoint
- Added centralized error handling
- Configured interactive API documentation with Swagger UI

generated with BackendIM... (backend.im)
Automated Action 2025-05-13 15:58:51 +00:00
parent 14c105961b
commit cd7c0162fe
18 changed files with 593 additions and 2 deletions

README.md

@@ -1,3 +1,84 @@
-# FastAPI Application
-This is a FastAPI application bootstrapped by BackendIM, the AI-powered backend generation platform.
+# Generic REST API Service
+A RESTful API service built with FastAPI and SQLite, providing standard CRUD operations with a clean architecture.
## Features
- FastAPI framework for high performance
- SQLite database with SQLAlchemy ORM
- Alembic for database migrations
- RESTful API endpoints
- Input validation with Pydantic
- Error handling
- Health check endpoint
- Interactive API documentation (Swagger UI and ReDoc)
## Project Structure
```
├── alembic/ # Database migrations
│ └── versions/ # Migration scripts
├── app/ # Application code
│ ├── api/ # API endpoints
│ │ ├── endpoints/ # Individual API route handlers
│ │ └── routes.py # API router setup
│ ├── database.py # Database connection setup
│ ├── errors.py # Error handling
│ ├── models.py # SQLAlchemy models
│ └── schemas.py # Pydantic schemas
├── main.py # Application entry point
├── alembic.ini # Alembic configuration
└── requirements.txt # Project dependencies
```
## Installation
1. Clone the repository
2. Install dependencies:
```bash
pip install -r requirements.txt
```
## Usage
Run the application:
```bash
uvicorn main:app --reload
```
The API will be available at http://localhost:8000
## API Documentation
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
## API Endpoints
### Health Check
- `GET /health` - Check API health status
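A quick way to exercise this endpoint from Python, as a sketch using only the standard library (it assumes the server from the Usage section is running on localhost:8000):
```python
import json
import urllib.request

# Call the health endpoint; the body follows the HealthCheck schema from app/schemas.py
with urllib.request.urlopen("http://localhost:8000/health") as resp:
    payload = json.load(resp)

print(payload["status"])     # "healthy" when the database connectivity check passes
print(payload["timestamp"])  # timestamp set by the endpoint at request time
```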
### Items
- `GET /items` - List all items (optional `name`, `is_active`, `skip`, and `limit` query parameters)
- `POST /items` - Create a new item
- `GET /items/{item_id}` - Get a specific item
- `PUT /items/{item_id}` - Update an item
- `DELETE /items/{item_id}` - Delete an item
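The sketch below walks through a full create/read/update/delete round trip against these endpoints. It assumes a server running on localhost:8000 and uses only the standard library, since no HTTP client is pinned in requirements.txt; the `call` helper and the example field values are purely illustrative.
```python
import json
import urllib.request

BASE = "http://localhost:8000/items/"

def call(url, method="GET", body=None):
    # Tiny JSON-over-HTTP helper for the examples below
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        raw = resp.read()
        return json.loads(raw) if raw else None

# POST /items - price is an integer number of cents, per the Item model
item = call(BASE, "POST", {"name": "Widget", "description": "Example item", "price": 1999})

# GET and PUT on /items/{item_id}
print(call(f"{BASE}{item['id']}"))
print(call(f"{BASE}{item['id']}", "PUT", {"price": 2499}))

# GET /items with query-string filtering
print(call(f"{BASE}?name=Widget&is_active=true"))

# DELETE /items/{item_id} returns 204 No Content with an empty body
call(f"{BASE}{item['id']}", "DELETE")
```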
## Database Management
This project uses Alembic for database migrations:
```bash
# Apply migrations
alembic upgrade head
# Create a new migration
alembic revision -m "description"
```
The SQLite database is stored at `/app/storage/db/db.sqlite`.
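To verify that the migration actually created the schema, one option is to open the file with Python's built-in sqlite3 module (a sketch; the path is the one configured in alembic.ini and app/database.py, and it assumes the migrations above have been applied):
```python
import sqlite3

# Open the application database read-only and list its tables
conn = sqlite3.connect("file:/app/storage/db/db.sqlite?mode=ro", uri=True)
tables = [name for (name,) in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
)]
conn.close()

print(tables)  # expect 'items' plus Alembic's 'alembic_version' bookkeeping table
```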

alembic.ini Normal file

@@ -0,0 +1,85 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = alembic
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =
# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; this defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat alembic/versions
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
# SQLite URL example
sqlalchemy.url = sqlite:////app/storage/db/db.sqlite
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks=black
# black.type=console_scripts
# black.entrypoint=black
# black.options=-l 79
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

alembic/README Normal file

@@ -0,0 +1 @@
Generic single-database configuration with SQLite.

alembic/env.py Normal file

@@ -0,0 +1,78 @@
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
from app.models import Base
target_metadata = Base.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline():
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online():
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(
connection=connection, target_metadata=target_metadata
)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()

alembic/script.py.mako Normal file

@@ -0,0 +1,24 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}

alembic/versions/ (initial migration, filename not shown) Normal file

@@ -0,0 +1,39 @@
"""initial
Revision ID: 01_initial
Revises:
Create Date: 2025-05-13
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '01_initial'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# Create items table
op.create_table(
'items',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(100), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('price', sa.Integer(), nullable=False),
sa.Column('is_active', sa.Boolean(), default=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.func.now()),
sa.Column('updated_at', sa.DateTime(timezone=True), onupdate=sa.func.now()),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_items_id'), 'items', ['id'], unique=False)
op.create_index(op.f('ix_items_name'), 'items', ['name'], unique=False)
def downgrade():
op.drop_index(op.f('ix_items_name'), table_name='items')
op.drop_index(op.f('ix_items_id'), table_name='items')
op.drop_table('items')

app/__init__.py Normal file

@@ -0,0 +1 @@
# This file is intentionally left empty to make the directory a Python package.

app/api/__init__.py Normal file

@@ -0,0 +1 @@
# This file is intentionally left empty to make the directory a Python package.

app/api/endpoints/__init__.py Normal file

@@ -0,0 +1 @@
# This file is intentionally left empty to make the directory a Python package.

app/api/endpoints/health.py Normal file

@@ -0,0 +1,26 @@
from datetime import datetime
from fastapi import APIRouter, Depends
from sqlalchemy import text
from sqlalchemy.orm import Session
from app.database import get_db
from app import schemas
router = APIRouter()
@router.get("/", response_model=schemas.HealthCheck)
def health_check(db: Session = Depends(get_db)):
"""
Health check endpoint
"""
# Verify database connection is working
try:
db.execute("SELECT 1")
db_status = "healthy"
except Exception:
db_status = "unhealthy"
return {
"status": db_status,
"timestamp": datetime.now()
}

app/api/endpoints/items.py Normal file

@@ -0,0 +1,70 @@
from typing import List, Optional
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from app.database import get_db
from app import models, schemas
router = APIRouter()
@router.post("/", response_model=schemas.Item, status_code=status.HTTP_201_CREATED)
def create_item(item: schemas.ItemCreate, db: Session = Depends(get_db)):
db_item = models.Item(**item.model_dump())
db.add(db_item)
db.commit()
db.refresh(db_item)
return db_item
@router.get("/", response_model=List[schemas.Item])
def read_items(
skip: int = 0,
limit: int = 100,
name: Optional[str] = None,
is_active: Optional[bool] = None,
db: Session = Depends(get_db),
):
query = db.query(models.Item)
if name:
query = query.filter(models.Item.name.contains(name))
if is_active is not None:
query = query.filter(models.Item.is_active == is_active)
items = query.offset(skip).limit(limit).all()
return items
@router.get("/{item_id}", response_model=schemas.Item)
def read_item(item_id: int, db: Session = Depends(get_db)):
db_item = db.query(models.Item).filter(models.Item.id == item_id).first()
if db_item is None:
raise HTTPException(status_code=404, detail="Item not found")
return db_item
@router.put("/{item_id}", response_model=schemas.Item)
def update_item(item_id: int, item: schemas.ItemUpdate, db: Session = Depends(get_db)):
db_item = db.query(models.Item).filter(models.Item.id == item_id).first()
if db_item is None:
raise HTTPException(status_code=404, detail="Item not found")
update_data = item.model_dump(exclude_unset=True)
for key, value in update_data.items():
setattr(db_item, key, value)
db.commit()
db.refresh(db_item)
return db_item
@router.delete("/{item_id}", status_code=status.HTTP_204_NO_CONTENT)
def delete_item(item_id: int, db: Session = Depends(get_db)):
db_item = db.query(models.Item).filter(models.Item.id == item_id).first()
if db_item is None:
raise HTTPException(status_code=404, detail="Item not found")
db.delete(db_item)
db.commit()
return None

app/api/routes.py Normal file

@@ -0,0 +1,7 @@
from fastapi import APIRouter
from app.api.endpoints import items, health
api_router = APIRouter()
api_router.include_router(health.router, prefix="/health", tags=["health"])
api_router.include_router(items.router, prefix="/items", tags=["items"])

app/database.py Normal file

@@ -0,0 +1,31 @@
from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker  # declarative_base lives in sqlalchemy.orm as of 2.0
from pathlib import Path
# Create the database directory if it doesn't exist
DB_DIR = Path("/app") / "storage" / "db"
DB_DIR.mkdir(parents=True, exist_ok=True)
# Define the SQLite URL
SQLALCHEMY_DATABASE_URL = f"sqlite:///{DB_DIR}/db.sqlite"
# Create the SQLAlchemy engine
engine = create_engine(
SQLALCHEMY_DATABASE_URL,
connect_args={"check_same_thread": False}
)
# Create a SessionLocal class
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
# Create a Base class for declarative models
Base = declarative_base()
# Dependency to get the database session
def get_db():
db = SessionLocal()
try:
yield db
finally:
db.close()

app/errors.py Normal file

@@ -0,0 +1,53 @@
from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse
class AppError(Exception):
"""Base error class for the application"""
def __init__(self, message: str, status_code: int):
self.message = message
self.status_code = status_code
super().__init__(self.message)
class NotFoundError(AppError):
"""Resource not found error"""
def __init__(self, message: str):
super().__init__(message, status_code=status.HTTP_404_NOT_FOUND)
class ValidationError(AppError):
"""Data validation error"""
def __init__(self, message: str):
super().__init__(message, status_code=status.HTTP_400_BAD_REQUEST)
class DatabaseError(AppError):
"""Database-related error"""
def __init__(self, message: str):
super().__init__(message, status_code=status.HTTP_500_INTERNAL_SERVER_ERROR)
def add_error_handlers(app: FastAPI) -> None:
"""Configure exception handlers for the app"""
@app.exception_handler(AppError)
async def app_error_handler(request: Request, exc: AppError) -> JSONResponse:
return JSONResponse(
status_code=exc.status_code,
content={"detail": exc.message}
)
@app.exception_handler(status.HTTP_404_NOT_FOUND)
async def not_found_handler(request: Request, exc: Exception) -> JSONResponse:
return JSONResponse(
status_code=status.HTTP_404_NOT_FOUND,
content={"detail": "Resource not found"}
)
@app.exception_handler(status.HTTP_500_INTERNAL_SERVER_ERROR)
async def server_error_handler(request: Request, exc: Exception) -> JSONResponse:
return JSONResponse(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
content={"detail": "Internal server error"}
)

app/models.py Normal file

@@ -0,0 +1,15 @@
from sqlalchemy import Column, Integer, String, Boolean, DateTime, Text
from sqlalchemy.sql import func
from .database import Base
class Item(Base):
__tablename__ = "items"
id = Column(Integer, primary_key=True, index=True)
name = Column(String(100), nullable=False, index=True)
description = Column(Text, nullable=True)
price = Column(Integer, nullable=False) # Price in cents
is_active = Column(Boolean, default=True)
created_at = Column(DateTime(timezone=True), server_default=func.now())
updated_at = Column(DateTime(timezone=True), onupdate=func.now())

app/schemas.py Normal file

@@ -0,0 +1,36 @@
from typing import Optional
from datetime import datetime
from pydantic import BaseModel, ConfigDict, Field
class ItemBase(BaseModel):
name: str = Field(..., min_length=1, max_length=100)
description: Optional[str] = None
price: int = Field(..., gt=0) # Price in cents
is_active: bool = True
class ItemCreate(ItemBase):
pass
class ItemUpdate(BaseModel):
name: Optional[str] = Field(None, min_length=1, max_length=100)
description: Optional[str] = None
price: Optional[int] = Field(None, gt=0)
is_active: Optional[bool] = None
class Item(ItemBase):
id: int
created_at: datetime
updated_at: Optional[datetime] = None
    # Pydantic v2: from_attributes replaces the old orm_mode setting
    model_config = ConfigDict(from_attributes=True)
class HealthCheck(BaseModel):
status: str
timestamp: datetime

main.py Normal file

@@ -0,0 +1,34 @@
import uvicorn
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app.api.routes import api_router
from app.database import Base, engine
from app.errors import add_error_handlers
# Create database tables
Base.metadata.create_all(bind=engine)
app = FastAPI(
title="Generic REST API Service",
description="A simple REST API service built with FastAPI and SQLite",
version="0.1.0",
)
# Set up CORS
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
# Add error handlers
add_error_handlers(app)
# Include API router
app.include_router(api_router)
if __name__ == "__main__":
uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)

requirements.txt Normal file

@@ -0,0 +1,8 @@
fastapi==0.104.1
uvicorn==0.23.2
sqlalchemy==2.0.22
alembic==1.12.1
pydantic==2.4.2
pydantic-settings==2.0.3
python-dotenv==1.0.0
# Note: pathlib is part of the Python 3 standard library; the PyPI backport is not needed