Implement Task Management API with FastAPI and SQLite
- Set up project structure and dependencies
- Create database models and schemas for tasks
- Implement CRUD operations for tasks
- Add API endpoints for task management
- Create Alembic migrations for the database
- Add health check endpoint
- Implement error handling
- Add documentation in README.md
parent ac0d5a3064
commit 29d75cdf08
106 README.md
@@ -1,3 +1,105 @@
-# FastAPI Application
+# Task Management API

-This is a FastAPI application bootstrapped by BackendIM, the AI-powered backend generation platform.
+A simple task management API built with FastAPI and SQLite.

## Features

- Create, read, update, and delete tasks
- Filter tasks by status
- Mark tasks as completed
- Set task priorities
- Health check endpoint

## Tech Stack

- **FastAPI**: Modern, fast web framework for building APIs
- **SQLAlchemy**: SQL toolkit and ORM
- **Alembic**: Database migration tool
- **SQLite**: Lightweight, file-based database
- **Pydantic**: Data validation and settings management
- **Uvicorn**: ASGI server for FastAPI

## Setup and Installation

### Prerequisites

- Python 3.9+

### Installation

1. Clone the repository:

```bash
git clone <repository-url>
cd <repository-directory>
```

2. Create a virtual environment and activate it:

```bash
python -m venv venv
source venv/bin/activate  # On Windows, use: venv\Scripts\activate
```

3. Install dependencies:

```bash
pip install -r requirements.txt
```

4. Run database migrations:

```bash
alembic upgrade head
```

### Running the Application

Start the application with Uvicorn:

```bash
uvicorn main:app --host 0.0.0.0 --port 8000 --reload
```

The API will be available at http://localhost:8000.

## API Documentation

Once the application is running, you can access:

- Interactive API documentation: http://localhost:8000/docs
- Alternative API documentation: http://localhost:8000/redoc

## API Endpoints

### Health Check

- `GET /api/v1/health`: Check API and database health

### Tasks

- `GET /api/v1/tasks`: Get all tasks
- `POST /api/v1/tasks`: Create a new task
- `GET /api/v1/tasks/{id}`: Get a specific task
- `PUT /api/v1/tasks/{id}`: Update a task
- `DELETE /api/v1/tasks/{id}`: Delete (soft delete) a task

## Task Structure

A task has the following structure:

```json
{
  "id": 1,
  "title": "Sample Task",
  "description": "This is a sample task",
  "status": "todo",
  "priority": "medium",
  "due_date": "2023-12-31T23:59:59",
  "created_at": "2023-09-27T10:00:00",
  "updated_at": "2023-09-27T10:00:00"
}
```

- `status` can be: "todo", "in_progress", or "done"
- `priority` can be: "low", "medium", or "high"
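The documented task shape and its constrained fields can be sketched with stdlib dataclasses and enums — this is illustrative only (the API itself uses SQLAlchemy models and Pydantic schemas; `TaskExample` is a hypothetical name):

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional


class TaskStatus(str, Enum):
    TODO = "todo"
    IN_PROGRESS = "in_progress"
    DONE = "done"


class TaskPriority(str, Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


@dataclass
class TaskExample:
    # Mirrors the JSON structure above; server-managed fields
    # (created_at, updated_at) are optional here.
    id: int
    title: str
    status: TaskStatus = TaskStatus.TODO
    priority: TaskPriority = TaskPriority.MEDIUM
    description: Optional[str] = None
    due_date: Optional[datetime] = None

    def __post_init__(self) -> None:
        # Enforce the documented value sets: anything outside the
        # enums raises ValueError, just as the API would reject it.
        self.status = TaskStatus(self.status)
        self.priority = TaskPriority(self.priority)


task = TaskExample(id=1, title="Sample Task", status="todo", priority="medium")
```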
85 alembic.ini Normal file
@@ -0,0 +1,85 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; this defaults
# to migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat migrations/versions

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# SQLite URL with absolute path
sqlalchemy.url = sqlite:////app/storage/db/db.sqlite

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks=black
# black.type=console_scripts
# black.entrypoint=black
# black.options=-l 79

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
0 app/api/__init__.py Normal file
0 app/api/api_v1/__init__.py Normal file
7 app/api/api_v1/api.py Normal file
@@ -0,0 +1,7 @@
from fastapi import APIRouter

from app.api.api_v1.endpoints import health, tasks

api_router = APIRouter()
api_router.include_router(tasks.router, prefix="/tasks", tags=["tasks"])
api_router.include_router(health.router, prefix="/health", tags=["health"])
0 app/api/api_v1/endpoints/__init__.py Normal file
31 app/api/api_v1/endpoints/health.py Normal file
@@ -0,0 +1,31 @@
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy import text
from sqlalchemy.orm import Session

from app.core.database import get_db

router = APIRouter()


@router.get("/", tags=["Health"])
async def health_check(db: Session = Depends(get_db)):
    """
    Health check endpoint.

    Verifies:
    - API is running
    - Database connection is working
    """
    try:
        # Test database connection; SQLAlchemy 2.x requires raw SQL
        # to be wrapped in text()
        db.execute(text("SELECT 1"))

        return {
            "status": "healthy",
            "api": "up",
            "database": "up",
        }
    except Exception as e:
        raise HTTPException(
            status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
            detail=f"Service unhealthy: {str(e)}",
        ) from e
114 app/api/api_v1/endpoints/tasks.py Normal file
@@ -0,0 +1,114 @@
from typing import List, Optional

from fastapi import APIRouter, Depends, status
from sqlalchemy.orm import Session

from app.core.database import get_db
from app.core.exceptions import InvalidTaskData, TaskNotFound
from app.crud.task import task
from app.models.task import TaskStatus
from app.schemas.task import Task, TaskCreate, TaskUpdate

router = APIRouter()


@router.get("/", response_model=List[Task])
def read_tasks(
    db: Session = Depends(get_db),
    skip: int = 0,
    limit: int = 100,
    status: Optional[TaskStatus] = None,
):
    """
    Retrieve tasks.

    - **skip**: Number of tasks to skip
    - **limit**: Maximum number of tasks to return
    - **status**: Filter tasks by status
    """
    if status:
        return task.get_multi_by_status(db, status=status, skip=skip, limit=limit)
    return task.get_multi(db, skip=skip, limit=limit)


@router.post("/", response_model=Task, status_code=status.HTTP_201_CREATED)
def create_task(
    *,
    db: Session = Depends(get_db),
    task_in: TaskCreate,
):
    """
    Create new task.

    - **title**: Title of the task (required)
    - **description**: Description of the task (optional)
    - **status**: Status of the task (default: todo)
    - **priority**: Priority of the task (default: medium)
    - **due_date**: Due date of the task (optional)
    """
    try:
        return task.create(db=db, obj_in=task_in)
    except Exception as e:
        raise InvalidTaskData(detail=f"Could not create task: {str(e)}") from e


@router.get("/{id}", response_model=Task)
def read_task(
    *,
    db: Session = Depends(get_db),
    id: int,
):
    """
    Get task by ID.

    - **id**: ID of the task
    """
    task_obj = task.get(db=db, id=id)
    if not task_obj:
        raise TaskNotFound(task_id=id)
    return task_obj


@router.put("/{id}", response_model=Task)
def update_task(
    *,
    db: Session = Depends(get_db),
    id: int,
    task_in: TaskUpdate,
):
    """
    Update a task.

    - **id**: ID of the task
    - **title**: New title (optional)
    - **description**: New description (optional)
    - **status**: New status (optional)
    - **priority**: New priority (optional)
    - **due_date**: New due date (optional)
    """
    task_obj = task.get(db=db, id=id)
    if not task_obj:
        raise TaskNotFound(task_id=id)

    try:
        return task.update(db=db, db_obj=task_obj, obj_in=task_in)
    except Exception as e:
        raise InvalidTaskData(detail=f"Could not update task: {str(e)}") from e


@router.delete("/{id}", response_model=Task)
def delete_task(
    *,
    db: Session = Depends(get_db),
    id: int,
):
    """
    Delete a task.

    - **id**: ID of the task
    """
    task_obj = task.get(db=db, id=id)
    if not task_obj:
        raise TaskNotFound(task_id=id)

    return task.remove(db=db, id=id)
33 app/core/config.py Normal file
@@ -0,0 +1,33 @@
from pathlib import Path
from typing import List, Union

from pydantic import AnyHttpUrl, field_validator
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    API_V1_STR: str = "/api/v1"
    PROJECT_NAME: str = "Task Management API"

    # CORS Configuration
    BACKEND_CORS_ORIGINS: List[Union[str, AnyHttpUrl]] = []

    # Pydantic v2 (pinned in requirements.txt) uses field_validator
    # with mode="before" instead of the v1 validator(..., pre=True)
    @field_validator("BACKEND_CORS_ORIGINS", mode="before")
    @classmethod
    def assemble_cors_origins(cls, v: Union[str, List[str]]) -> Union[List[str], str]:
        if isinstance(v, str) and not v.startswith("["):
            return [i.strip() for i in v.split(",")]
        elif isinstance(v, (list, str)):
            return v
        raise ValueError(v)

    # Database Configuration
    DB_DIR: Path = Path("/app") / "storage" / "db"

    class Config:
        case_sensitive = True


settings = Settings()

# Create database directory
settings.DB_DIR.mkdir(parents=True, exist_ok=True)
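Outside of Pydantic, the validator's behavior reduces to this small pure function (hypothetical name, stdlib only): a comma-separated string of origins becomes a list of trimmed values, while lists and JSON-style strings pass through unchanged.

```python
from typing import List, Union


def assemble_cors_origins_sketch(v: Union[str, List[str]]) -> Union[List[str], str]:
    """Mirror of Settings.assemble_cors_origins, detached from Pydantic."""
    if isinstance(v, str) and not v.startswith("["):
        # "a, b" -> ["a", "b"] with surrounding whitespace stripped
        return [i.strip() for i in v.split(",")]
    elif isinstance(v, (list, str)):
        # Lists and JSON-array strings (starting with "[") pass through
        return v
    raise ValueError(v)
```

This lets `BACKEND_CORS_ORIGINS` be supplied from an environment variable as a plain comma-separated string.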
25 app/core/database.py Normal file
@@ -0,0 +1,25 @@
from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

from app.core.config import settings

SQLALCHEMY_DATABASE_URL = f"sqlite:///{settings.DB_DIR}/db.sqlite"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL,
    connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base = declarative_base()


def get_db():
    """Get SQLAlchemy database session."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
33 app/core/error_handlers.py Normal file
@@ -0,0 +1,33 @@
from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse
from sqlalchemy.exc import SQLAlchemyError

from app.core.exceptions import TaskException


def add_exception_handlers(app: FastAPI) -> None:
    """Add global exception handlers to the FastAPI app."""

    @app.exception_handler(TaskException)
    async def task_exception_handler(request: Request, exc: TaskException):
        """Handle Task API exceptions."""
        return JSONResponse(
            status_code=exc.status_code,
            content={"detail": exc.detail},
        )

    @app.exception_handler(SQLAlchemyError)
    async def sqlalchemy_exception_handler(request: Request, exc: SQLAlchemyError):
        """Handle SQLAlchemy exceptions."""
        return JSONResponse(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            content={"detail": "Database error occurred"},
        )

    @app.exception_handler(Exception)
    async def general_exception_handler(request: Request, exc: Exception):
        """Handle general exceptions."""
        return JSONResponse(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            content={"detail": "An unexpected error occurred"},
        )
45 app/core/exceptions.py Normal file
@@ -0,0 +1,45 @@
from typing import Any, Dict, Optional

from fastapi import HTTPException, status


class TaskException(HTTPException):
    """Base Task API exception."""

    def __init__(
        self,
        status_code: int,
        detail: Any = None,
        headers: Optional[Dict[str, Any]] = None,
    ) -> None:
        super().__init__(status_code=status_code, detail=detail, headers=headers)


class TaskNotFound(TaskException):
    """Exception raised when a task is not found."""

    def __init__(
        self,
        task_id: int,
        headers: Optional[Dict[str, Any]] = None,
    ) -> None:
        super().__init__(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Task with ID {task_id} not found",
            headers=headers,
        )


class InvalidTaskData(TaskException):
    """Exception raised when task data is invalid."""

    def __init__(
        self,
        detail: str = "Invalid task data",
        headers: Optional[Dict[str, Any]] = None,
    ) -> None:
        super().__init__(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail=detail,
            headers=headers,
        )
0 app/crud/__init__.py Normal file
112 app/crud/base.py Normal file
@@ -0,0 +1,112 @@
from typing import Any, Dict, Generic, List, Optional, Type, TypeVar, Union

from fastapi.encoders import jsonable_encoder
from pydantic import BaseModel
from sqlalchemy.orm import Session

from app.core.database import Base

ModelType = TypeVar("ModelType", bound=Base)
CreateSchemaType = TypeVar("CreateSchemaType", bound=BaseModel)
UpdateSchemaType = TypeVar("UpdateSchemaType", bound=BaseModel)


class CRUDBase(Generic[ModelType, CreateSchemaType, UpdateSchemaType]):
    """Base class for CRUD operations."""

    def __init__(self, model: Type[ModelType]):
        """Initialize with SQLAlchemy model.

        Args:
            model: The SQLAlchemy model
        """
        self.model = model

    def get(self, db: Session, id: Any) -> Optional[ModelType]:
        """Get a record by ID.

        Args:
            db: Database session
            id: ID of the record to get

        Returns:
            The record if found, None otherwise
        """
        return db.query(self.model).filter(self.model.id == id).first()

    def get_multi(
        self, db: Session, *, skip: int = 0, limit: int = 100
    ) -> List[ModelType]:
        """Get multiple records.

        Args:
            db: Database session
            skip: Number of records to skip
            limit: Maximum number of records to return

        Returns:
            List of records
        """
        return db.query(self.model).offset(skip).limit(limit).all()

    def create(self, db: Session, *, obj_in: CreateSchemaType) -> ModelType:
        """Create a new record.

        Args:
            db: Database session
            obj_in: Schema with data to create

        Returns:
            The created record
        """
        obj_in_data = jsonable_encoder(obj_in)
        db_obj = self.model(**obj_in_data)
        db.add(db_obj)
        db.commit()
        db.refresh(db_obj)
        return db_obj

    def update(
        self,
        db: Session,
        *,
        db_obj: ModelType,
        obj_in: Union[UpdateSchemaType, Dict[str, Any]]
    ) -> ModelType:
        """Update a record.

        Args:
            db: Database session
            db_obj: The database object to update
            obj_in: Schema with data to update

        Returns:
            The updated record
        """
        obj_data = jsonable_encoder(db_obj)
        if isinstance(obj_in, dict):
            update_data = obj_in
        else:
            update_data = obj_in.model_dump(exclude_unset=True)
        for field in obj_data:
            if field in update_data:
                setattr(db_obj, field, update_data[field])
        db.add(db_obj)
        db.commit()
        db.refresh(db_obj)
        return db_obj

    def remove(self, db: Session, *, id: int) -> ModelType:
        """Remove a record.

        Args:
            db: Database session
            id: ID of the record to remove

        Returns:
            The removed record
        """
        # Session.get() replaces the legacy Query.get(), which is
        # deprecated in SQLAlchemy 2.x
        obj = db.get(self.model, id)
        db.delete(obj)
        db.commit()
        return obj
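The generic pattern in `CRUDBase` — one class parameterized over the model type, reused for every resource — can be shown without a database using a stdlib-only analogue (`InMemoryCRUD` is a hypothetical name, not part of the codebase):

```python
from typing import Dict, Generic, Optional, TypeVar

T = TypeVar("T")


class InMemoryCRUD(Generic[T]):
    """Minimal in-memory analogue of CRUDBase: same get/create/remove
    shape, backed by a dict instead of a SQLAlchemy session."""

    def __init__(self) -> None:
        self._items: Dict[int, T] = {}
        self._next_id = 1

    def create(self, obj: T) -> int:
        # Assign an auto-incrementing ID, like the integer primary key
        obj_id = self._next_id
        self._items[obj_id] = obj
        self._next_id += 1
        return obj_id

    def get(self, obj_id: int) -> Optional[T]:
        return self._items.get(obj_id)

    def remove(self, obj_id: int) -> Optional[T]:
        return self._items.pop(obj_id, None)
```

A concrete store is then just `InMemoryCRUD[str]()`, the same way `CRUDTask` specializes `CRUDBase` below.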
91 app/crud/task.py Normal file
@@ -0,0 +1,91 @@
from typing import List, Optional

from sqlalchemy.orm import Session

from app.crud.base import CRUDBase
from app.models.task import Task, TaskStatus
from app.schemas.task import TaskCreate, TaskUpdate


class CRUDTask(CRUDBase[Task, TaskCreate, TaskUpdate]):
    """CRUD operations for tasks."""

    def get_multi_by_status(
        self, db: Session, *, status: TaskStatus, skip: int = 0, limit: int = 100
    ) -> List[Task]:
        """Get multiple tasks by status.

        Args:
            db: Database session
            status: Task status
            skip: Number of records to skip
            limit: Maximum number of records to return

        Returns:
            List of tasks with specified status
        """
        # Python's `not` cannot be applied to a Column expression;
        # use is_(False) to generate the SQL predicate.
        return (
            db.query(self.model)
            .filter(self.model.status == status, self.model.is_deleted.is_(False))
            .offset(skip)
            .limit(limit)
            .all()
        )

    def get_multi(
        self, db: Session, *, skip: int = 0, limit: int = 100
    ) -> List[Task]:
        """Get multiple tasks, excluding deleted ones.

        Args:
            db: Database session
            skip: Number of records to skip
            limit: Maximum number of records to return

        Returns:
            List of non-deleted tasks
        """
        return (
            db.query(self.model)
            .filter(self.model.is_deleted.is_(False))
            .offset(skip)
            .limit(limit)
            .all()
        )

    def get(self, db: Session, id: int) -> Optional[Task]:
        """Get a task by ID, excluding deleted ones.

        Args:
            db: Database session
            id: Task ID

        Returns:
            The task if found and not deleted, None otherwise
        """
        return (
            db.query(self.model)
            .filter(self.model.id == id, self.model.is_deleted.is_(False))
            .first()
        )

    def remove(self, db: Session, *, id: int) -> Task:
        """Soft delete a task.

        Args:
            db: Database session
            id: Task ID

        Returns:
            The soft deleted task
        """
        obj = db.get(self.model, id)
        if obj:
            obj.is_deleted = True
            db.add(obj)
            db.commit()
            db.refresh(obj)
        return obj


task = CRUDTask(Task)
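The soft-delete scheme above — `remove` flips a flag instead of deleting the row, and every read filters that flag out — can be sketched without SQLAlchemy using a small in-memory store (`SoftDeleteStore` and `Record` are hypothetical names for illustration):

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Record:
    title: str
    is_deleted: bool = False


class SoftDeleteStore:
    """Rows are flagged rather than removed; reads filter the flag,
    mirroring CRUDTask.get/get_multi/remove."""

    def __init__(self) -> None:
        self._rows: Dict[int, Record] = {}

    def add(self, rec_id: int, rec: Record) -> None:
        self._rows[rec_id] = rec

    def get(self, rec_id: int) -> Optional[Record]:
        # A soft-deleted record is invisible to normal reads
        rec = self._rows.get(rec_id)
        return rec if rec is not None and not rec.is_deleted else None

    def remove(self, rec_id: int) -> Optional[Record]:
        # "Delete" just sets the flag; the row stays recoverable
        rec = self._rows.get(rec_id)
        if rec is not None:
            rec.is_deleted = True
        return rec
```

The trade-off is the same as in the real CRUD layer: deleted data remains in storage (and auditable), but every query path must remember to filter on the flag.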
0 app/models/__init__.py Normal file
34 app/models/task.py Normal file
@@ -0,0 +1,34 @@
from datetime import datetime
from enum import Enum as PyEnum

from sqlalchemy import Boolean, Column, DateTime, Enum, Integer, String, Text

from app.core.database import Base


class TaskStatus(str, PyEnum):
    TODO = "todo"
    IN_PROGRESS = "in_progress"
    DONE = "done"


class TaskPriority(str, PyEnum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


class Task(Base):
    """Task model."""

    __tablename__ = "tasks"

    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(255), nullable=False)
    description = Column(Text, nullable=True)
    status = Column(Enum(TaskStatus), default=TaskStatus.TODO)
    priority = Column(Enum(TaskPriority), default=TaskPriority.MEDIUM)
    due_date = Column(DateTime, nullable=True)
    is_deleted = Column(Boolean, default=False)
    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
0 app/schemas/__init__.py Normal file
49 app/schemas/task.py Normal file
@@ -0,0 +1,49 @@
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, Field

from app.models.task import TaskPriority, TaskStatus


class TaskBase(BaseModel):
    """Base schema for Task."""

    title: str = Field(..., min_length=1, max_length=255)
    description: Optional[str] = None
    status: TaskStatus = TaskStatus.TODO
    priority: TaskPriority = TaskPriority.MEDIUM
    due_date: Optional[datetime] = None


class TaskCreate(TaskBase):
    """Schema for creating a Task."""

    pass


class TaskUpdate(BaseModel):
    """Schema for updating a Task."""

    title: Optional[str] = Field(None, min_length=1, max_length=255)
    description: Optional[str] = None
    status: Optional[TaskStatus] = None
    priority: Optional[TaskPriority] = None
    due_date: Optional[datetime] = None


class TaskInDBBase(TaskBase):
    """Base schema for Task stored in DB."""

    id: int
    created_at: datetime
    updated_at: datetime

    class Config:
        from_attributes = True


class Task(TaskInDBBase):
    """Schema for Task output."""

    pass
43 main.py Normal file
@@ -0,0 +1,43 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import RedirectResponse

from app.api.api_v1.api import api_router
from app.core.config import settings
from app.core.error_handlers import add_exception_handlers

app = FastAPI(
    title=settings.PROJECT_NAME,
    openapi_url=f"{settings.API_V1_STR}/openapi.json",
    version="0.1.0",
    description="A simple task management API with FastAPI and SQLite.",
    docs_url="/docs",
    redoc_url="/redoc",
)

# Set all CORS enabled origins
if settings.BACKEND_CORS_ORIGINS:
    app.add_middleware(
        CORSMiddleware,
        allow_origins=[str(origin) for origin in settings.BACKEND_CORS_ORIGINS],
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )

# Add exception handlers
add_exception_handlers(app)

# Include API router
app.include_router(api_router, prefix=settings.API_V1_STR)


@app.get("/", include_in_schema=False)
async def root():
    """Root endpoint redirects to docs."""
    return RedirectResponse(url="/docs")


if __name__ == "__main__":
    import uvicorn

    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)
1 migrations/README Normal file
@@ -0,0 +1 @@
Generic single-database configuration with SQLite.
89 migrations/env.py Normal file
@@ -0,0 +1,89 @@
import sys
from logging.config import fileConfig
from pathlib import Path

from alembic import context
from sqlalchemy import engine_from_config, pool

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata

# Add the project root directory to the Python path
sys.path.insert(0, str(Path(__file__).parent.parent))

from app.core.database import Base  # noqa
from app.models.task import Task  # noqa

target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
        render_as_batch=True,  # Required for SQLite
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        is_sqlite = connection.dialect.name == "sqlite"
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            render_as_batch=is_sqlite,  # Required for SQLite
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
24 migrations/script.py.mako Normal file
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
48 migrations/versions/001_initial_migration.py Normal file
@@ -0,0 +1,48 @@
"""Initial migration

Revision ID: 001
Revises:
Create Date: 2023-09-27

"""
import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision = '001'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('tasks',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('title', sa.String(length=255), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column(
            'status',
            sa.Enum('todo', 'in_progress', 'done', name='taskstatus'),
            nullable=True
        ),
        sa.Column(
            'priority',
            sa.Enum('low', 'medium', 'high', name='taskpriority'),
            nullable=True
        ),
        sa.Column('due_date', sa.DateTime(), nullable=True),
        sa.Column('is_deleted', sa.Boolean(), nullable=True),
        sa.Column('created_at', sa.DateTime(), nullable=True),
        sa.Column('updated_at', sa.DateTime(), nullable=True),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_tasks_id'), 'tasks', ['id'], unique=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_tasks_id'), table_name='tasks')
    op.drop_table('tasks')
    # ### end Alembic commands ###
13 pyproject.toml Normal file
@@ -0,0 +1,13 @@
[tool.ruff]
line-length = 88
target-version = "py39"

[tool.ruff.lint]
select = ["E", "F", "I", "B", "W", "C90"]
ignore = ["B008"]  # Allow function calls in argument defaults for FastAPI dependencies

[tool.ruff.lint.isort]
known-third-party = ["fastapi", "pydantic", "sqlalchemy", "alembic", "uvicorn"]

[tool.ruff.lint.mccabe]
max-complexity = 10
10 requirements.txt Normal file
@@ -0,0 +1,10 @@
fastapi>=0.103.1
pydantic>=2.4.0
sqlalchemy>=2.0.21
alembic>=1.12.0
uvicorn>=0.23.2
python-dotenv>=1.0.0
ruff>=0.0.287
pytest>=7.4.2
httpx>=0.25.0
pydantic-settings>=2.0.3