Build complete task management tool with FastAPI
- Implemented full CRUD operations for tasks
- Added SQLite database with SQLAlchemy ORM
- Set up Alembic migrations for the database schema
- Created health check and base URL endpoints
- Added CORS middleware for cross-origin requests
- Included comprehensive API documentation
- Updated README with complete project information
parent 86c67fd619
commit ecb15eb903
README.md (76 changed lines)
@@ -1,3 +1,75 @@
-# FastAPI Application
-
-This is a FastAPI application bootstrapped by BackendIM, the AI-powered backend generation platform.
+# Task Management Tool
+
+A FastAPI-based task management system that allows you to create, read, update, and delete tasks.
+
+## Features
+
+- **CRUD Operations**: Create, read, update, and delete tasks
+- **Task Properties**: Each task has a title, description, completion status, and priority level
+- **SQLite Database**: Data persistence using SQLite with the SQLAlchemy ORM
+- **API Documentation**: Automatic API documentation with Swagger UI
+- **Health Check**: Built-in health check endpoint
+- **CORS Support**: Cross-origin resource sharing enabled
+
+## API Endpoints
+
+### Base Endpoints
+
+- `GET /` - API information and links
+- `GET /health` - Health check endpoint
+- `GET /docs` - Swagger UI documentation
+- `GET /redoc` - ReDoc documentation
+- `GET /openapi.json` - OpenAPI specification
+
+### Task Endpoints
+
+- `POST /tasks/` - Create a new task
+- `GET /tasks/` - Get all tasks (with pagination)
+- `GET /tasks/{task_id}` - Get a specific task by ID
+- `PUT /tasks/{task_id}` - Update a specific task
+- `DELETE /tasks/{task_id}` - Delete a specific task
+
+## Installation and Setup
+
+1. Install dependencies:
+
+```bash
+pip install -r requirements.txt
+```
+
+2. Run the application:
+
+```bash
+uvicorn main:app --host 0.0.0.0 --port 8000
+```
+
+The application will be available at http://localhost:8000.
+
+## Task Model
+
+Each task has the following properties:
+
+- `id`: Unique identifier (auto-generated)
+- `title`: Task title (required)
+- `description`: Task description (optional)
+- `completed`: Completion status (default: false)
+- `priority`: Priority level - "low", "medium", "high" (default: "medium")
+- `created_at`: Creation timestamp (auto-generated)
+- `updated_at`: Last update timestamp (auto-updated)
+
+## Environment Variables
+
+No environment variables are required for basic operation. The application uses a SQLite database stored at `/app/storage/db/db.sqlite`.
+
+## Database
+
+The application uses a SQLite database with Alembic for migrations. The database is created automatically when the application starts.
+
+## Development
+
+- Database models are in `app/models/`
+- API routes are in `app/routers/`
+- Pydantic schemas are in `app/schemas/`
+- Database configuration is in `app/db/`
+
+## Health Check
+
+The `/health` endpoint reports the status of the application and its database:
+
+- Returns service status
+- Checks database connectivity
+- Provides error information if issues are detected
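The endpoints documented above can be exercised with any HTTP client once the server is running. A minimal sketch, assuming the application is listening on localhost:8000 and that `requests` is installed (it is not part of requirements.txt):

```python
# Illustrative client calls against a locally running instance.
import requests

BASE = "http://localhost:8000"

# Create a task; the response echoes the stored task, including its generated id.
task = requests.post(f"{BASE}/tasks/", json={"title": "Buy milk", "priority": "low"}).json()
print(task["id"], task["completed"])  # e.g. 1 False

# List tasks with pagination, then mark the new task as done.
print(requests.get(f"{BASE}/tasks/", params={"skip": 0, "limit": 10}).json())
requests.put(f"{BASE}/tasks/{task['id']}", json={"completed": True})
```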
alembic.ini (new file, 41 lines)
@@ -0,0 +1,41 @@
[alembic]
script_location = alembic
prepend_sys_path = .
version_path_separator = os
sqlalchemy.url = sqlite:////app/storage/db/db.sqlite

[post_write_hooks]

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
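This configuration is normally consumed by the `alembic` CLI (`alembic upgrade head`), but the same migration run can be triggered programmatically. A rough sketch, assuming it is executed from the project root where `alembic.ini` lives; the helper itself is hypothetical and not part of the repository:

```python
# Hypothetical helper: apply all Alembic migrations using the alembic.ini above,
# equivalent to running `alembic upgrade head` from the command line.
from alembic import command
from alembic.config import Config


def migrate_to_head(ini_path: str = "alembic.ini") -> None:
    cfg = Config(ini_path)        # reads script_location and sqlalchemy.url
    command.upgrade(cfg, "head")  # runs alembic/versions/* up to the latest revision


if __name__ == "__main__":
    migrate_to_head()
```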
alembic/env.py (new file, 83 lines)
@@ -0,0 +1,83 @@
import sys
import os
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# Add the project directory to Python path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..'))

# Import the Base from your models
from app.db.base import Base
from app.models.task import Task

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
alembic/script.py.mako (new file, 24 lines)
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
alembic/versions/001_initial_migration.py (new file, 34 lines)
@@ -0,0 +1,34 @@
"""Initial migration - Create tasks table

Revision ID: 001
Revises:
Create Date: 2024-01-01 12:00:00.000000

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '001'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table('tasks',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('title', sa.String(length=200), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('completed', sa.Boolean(), nullable=False),
        sa.Column('priority', sa.String(length=10), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=True),
        sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_tasks_id'), 'tasks', ['id'], unique=False)


def downgrade() -> None:
    op.drop_index(op.f('ix_tasks_id'), table_name='tasks')
    op.drop_table('tasks')
app/__init__.py (new file, empty)

app/db/__init__.py (new file, empty)
app/db/base.py (new file, 3 lines)
@@ -0,0 +1,3 @@
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()
app/db/session.py (new file, 23 lines)
@@ -0,0 +1,23 @@
from pathlib import Path
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

DB_DIR = Path("/app") / "storage" / "db"
DB_DIR.mkdir(parents=True, exist_ok=True)

SQLALCHEMY_DATABASE_URL = f"sqlite:///{DB_DIR}/db.sqlite"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL,
    connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
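Besides serving as the `get_db` dependency, `SessionLocal` can be used directly, for example from a one-off maintenance script. An illustrative sketch (no such script exists in the repository; importing this module creates `/app/storage/db`, so it assumes that path is writable):

```python
# Illustrative only: open a session outside FastAPI's dependency injection,
# count the tasks that are still open, and always close the session afterwards.
from app.db.session import SessionLocal
from app.models.task import Task

db = SessionLocal()
try:
    open_tasks = db.query(Task).filter(Task.completed.is_(False)).all()
    print(f"{len(open_tasks)} task(s) still open")
finally:
    db.close()
```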
app/models/__init__.py (new file, empty)
app/models/task.py (new file, 15 lines)
@@ -0,0 +1,15 @@
from sqlalchemy import Column, Integer, String, Text, DateTime, Boolean
from sqlalchemy.sql import func
from app.db.base import Base


class Task(Base):
    __tablename__ = "tasks"

    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(200), nullable=False)
    description = Column(Text, nullable=True)
    completed = Column(Boolean, default=False, nullable=False)
    priority = Column(String(10), default="medium", nullable=False)
    created_at = Column(DateTime(timezone=True), server_default=func.now())
    updated_at = Column(DateTime(timezone=True), onupdate=func.now())
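To see the model in isolation, it can be pointed at a throwaway in-memory database. The sketch below is illustrative only and not part of the repository:

```python
# Create the tasks table on an in-memory SQLite database and insert one row
# via the ORM model; nothing here touches the /app/storage path.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from app.db.base import Base
from app.models.task import Task

engine = create_engine("sqlite://")    # throwaway in-memory database
Base.metadata.create_all(bind=engine)  # issues CREATE TABLE tasks ...

with sessionmaker(bind=engine)() as session:
    session.add(Task(title="Try the model", priority="high"))
    session.commit()
    task = session.query(Task).first()
    # completed falls back to its default, created_at comes from server_default
    print(task.id, task.completed, task.created_at)
```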
app/routers/__init__.py (new file, empty)
app/routers/tasks.py (new file, 58 lines)
@@ -0,0 +1,58 @@
from typing import List
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session

from app.db.session import get_db
from app.models.task import Task
from app.schemas.task import Task as TaskSchema, TaskCreate, TaskUpdate

router = APIRouter()


@router.post("/", response_model=TaskSchema)
def create_task(task: TaskCreate, db: Session = Depends(get_db)):
    db_task = Task(**task.dict())
    db.add(db_task)
    db.commit()
    db.refresh(db_task)
    return db_task


@router.get("/", response_model=List[TaskSchema])
def read_tasks(skip: int = 0, limit: int = 100, db: Session = Depends(get_db)):
    tasks = db.query(Task).offset(skip).limit(limit).all()
    return tasks


@router.get("/{task_id}", response_model=TaskSchema)
def read_task(task_id: int, db: Session = Depends(get_db)):
    task = db.query(Task).filter(Task.id == task_id).first()
    if task is None:
        raise HTTPException(status_code=404, detail="Task not found")
    return task


@router.put("/{task_id}", response_model=TaskSchema)
def update_task(task_id: int, task_update: TaskUpdate, db: Session = Depends(get_db)):
    task = db.query(Task).filter(Task.id == task_id).first()
    if task is None:
        raise HTTPException(status_code=404, detail="Task not found")

    update_data = task_update.dict(exclude_unset=True)
    for field, value in update_data.items():
        setattr(task, field, value)

    db.commit()
    db.refresh(task)
    return task


@router.delete("/{task_id}")
def delete_task(task_id: int, db: Session = Depends(get_db)):
    task = db.query(Task).filter(Task.id == task_id).first()
    if task is None:
        raise HTTPException(status_code=404, detail="Task not found")

    db.delete(task)
    db.commit()
    return {"message": "Task deleted successfully"}
app/schemas/__init__.py (new file, empty)
app/schemas/task.py (new file, 30 lines)
@@ -0,0 +1,30 @@
from datetime import datetime
from typing import Optional
from pydantic import BaseModel


class TaskBase(BaseModel):
    title: str
    description: Optional[str] = None
    completed: bool = False
    priority: str = "medium"


class TaskCreate(TaskBase):
    pass


class TaskUpdate(BaseModel):
    title: Optional[str] = None
    description: Optional[str] = None
    completed: Optional[bool] = None
    priority: Optional[str] = None


class Task(TaskBase):
    id: int
    created_at: datetime
    updated_at: Optional[datetime] = None

    class Config:
        from_attributes = True
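The `from_attributes = True` config is what lets the ORM objects returned by the routers be serialized through the `Task` schema. A small sketch of the Pydantic v2 behaviour, using a hypothetical stand-in object rather than a real database row:

```python
from datetime import datetime
from app.schemas.task import Task, TaskCreate

# Defaults from TaskBase are filled in for fields the client omits.
payload = TaskCreate(title="Write docs", priority="high")
print(payload.model_dump())
# -> {'title': 'Write docs', 'description': None, 'completed': False, 'priority': 'high'}


class FakeRow:
    # Stand-in with attribute access, mimicking an ORM row (for illustration only).
    id = 1
    title = "Write docs"
    description = None
    completed = False
    priority = "high"
    created_at = datetime(2024, 1, 1, 12, 0)
    updated_at = None


# from_attributes=True allows validation straight from object attributes.
print(Task.model_validate(FakeRow()))
```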
main.py (new file, 55 lines)
@@ -0,0 +1,55 @@
from fastapi import FastAPI, Depends
from fastapi.middleware.cors import CORSMiddleware
from sqlalchemy.orm import Session
from sqlalchemy import text

from app.db.session import get_db, engine
from app.db.base import Base
from app.routers import tasks

app = FastAPI(
    title="Task Management Tool",
    description="A FastAPI-based task management system",
    version="1.0.0",
    openapi_url="/openapi.json"
)

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

Base.metadata.create_all(bind=engine)

app.include_router(tasks.router, prefix="/tasks", tags=["tasks"])


@app.get("/")
async def root():
    return {
        "title": "Task Management Tool",
        "description": "A FastAPI-based task management system",
        "documentation": "/docs",
        "health_check": "/health"
    }


@app.get("/health")
async def health_check(db: Session = Depends(get_db)):
    try:
        db.execute(text("SELECT 1"))
        return {
            "status": "healthy",
            "database": "connected",
            "service": "task-management-tool"
        }
    except Exception as e:
        return {
            "status": "unhealthy",
            "database": "disconnected",
            "service": "task-management-tool",
            "error": str(e)
        }
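A rough smoke test for the wiring above, assuming `httpx` is installed (FastAPI's TestClient depends on it, and it is not listed in requirements.txt) and that the `/app/storage` directory used by `app.db.session` is writable wherever this runs:

```python
from fastapi.testclient import TestClient
from main import app  # importing main also creates the tables via Base.metadata.create_all

client = TestClient(app)

# Health check should report a reachable database.
assert client.get("/health").json()["status"] == "healthy"

# Create a task and confirm it shows up in the listing.
created = client.post("/tasks/", json={"title": "First task", "priority": "high"}).json()
assert created["completed"] is False

tasks = client.get("/tasks/").json()
assert any(t["id"] == created["id"] for t in tasks)
```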
requirements.txt (new file, 6 lines)
@@ -0,0 +1,6 @@
fastapi==0.104.1
uvicorn==0.24.0
sqlalchemy==2.0.23
alembic==1.12.1
pydantic==2.5.0
ruff==0.1.6