Create Task Manager API with FastAPI and SQLite
- Set up project structure and dependencies
- Create task model and schema
- Implement Alembic migrations
- Add CRUD API endpoints for task management
- Add health endpoint with database connectivity check
- Add comprehensive error handling
- Add tests for API endpoints
- Update README with API documentation
This commit is contained in:
parent 2d1d22ba98
commit 7658939790

129  README.md
@@ -1,3 +1,128 @@
-# FastAPI Application
-
-This is a FastAPI application bootstrapped by BackendIM, the AI-powered backend generation platform.

# Task Manager API

A RESTful API for managing tasks built with FastAPI and SQLite.

## Features

- CRUD operations for tasks
- Filtering by task status, priority, and title search
- Health check endpoint with database connectivity check
- Comprehensive error handling
- Database migrations with Alembic
- Complete test suite

## Technical Stack

- **Python 3.9+**
- **FastAPI** - High-performance web framework
- **SQLite** - Database
- **SQLAlchemy** - ORM
- **Alembic** - Database migrations
- **Pydantic** - Data validation
- **Pytest** - Testing

## Setup and Installation

### Prerequisites

- Python 3.9 or higher

### Installation

1. Clone the repository
2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Set up environment variables (optional):

   ```bash
   # Enable debug mode
   export DEBUG=True
   ```

### Database Setup

The database is created automatically at `/app/storage/db/db.sqlite` when the application starts. To apply migrations:

```bash
alembic upgrade head
```

### Running the API

```bash
# Development mode with auto-reload
uvicorn main:app --reload

# Production mode
uvicorn main:app --host 0.0.0.0 --port 8000
```

## API Documentation

Interactive API documentation is available at `/docs` or `/redoc` while the server is running.

### Endpoints

#### Health Check

```
GET /health
```

Returns the health status of the API, including database connectivity.

#### Tasks

| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/v1/tasks` | List all tasks with optional filtering |
| POST | `/api/v1/tasks` | Create a new task |
| GET | `/api/v1/tasks/{task_id}` | Get a specific task by ID |
| PUT | `/api/v1/tasks/{task_id}` | Update a specific task |
| DELETE | `/api/v1/tasks/{task_id}` | Delete a specific task |

### Filtering Tasks

The `GET /api/v1/tasks` endpoint supports the following query parameters:

- `completed` - Filter by completion status (`true`/`false`)
- `priority` - Filter by priority level (1-3)
- `search` - Search by task title
- `skip` - Number of items to skip (pagination)
- `limit` - Maximum number of items to return (pagination)
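These parameters combine into an ordinary query string. A minimal sketch of how a client might build one with the standard library (the host and port are placeholder assumptions; only the path and parameter names come from the API above):

```python
from urllib.parse import urlencode

# Build a filtered, paginated request for the task list endpoint.
params = {"completed": "false", "priority": 2, "skip": 0, "limit": 10}
url = "http://localhost:8000/api/v1/tasks?" + urlencode(params)
print(url)
# → http://localhost:8000/api/v1/tasks?completed=false&priority=2&skip=0&limit=10
```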
### Task Object Structure

```json
{
  "id": 1,
  "title": "Task Title",
  "description": "Task description",
  "completed": false,
  "priority": 2,
  "due_date": "2023-09-30T00:00:00",
  "created_at": "2023-09-01T12:00:00",
  "updated_at": "2023-09-01T12:00:00"
}
```

## Testing

Run the test suite with:

```bash
pytest
```

## Development

### Running Linting

```bash
ruff check .
ruff check --fix .
```
102  alembic.ini  Normal file
@@ -0,0 +1,102 @@

```ini
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:migrations/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os  # Use os.pathsep. Default configuration used in a Windows environment.

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# SQLite URL with absolute path
sqlalchemy.url = sqlite:////app/storage/db/db.sqlite

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
```
0  app/__init__.py  Normal file
0  app/api/__init__.py  Normal file
0  app/api/v1/__init__.py  Normal file

8  app/api/v1/api.py  Normal file
@@ -0,0 +1,8 @@

```python
from fastapi import APIRouter

from app.api.v1 import tasks

api_router = APIRouter()

# Include task routes
api_router.include_router(tasks.router, prefix="/tasks", tags=["tasks"])
```
108  app/api/v1/tasks.py  Normal file
@@ -0,0 +1,108 @@

```python
from typing import Any, List, Optional

from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session

from app import schemas, crud
from app.db.session import get_db

router = APIRouter()


@router.get("/", response_model=List[schemas.Task])
def read_tasks(
    db: Session = Depends(get_db),
    skip: int = 0,
    limit: int = 100,
    completed: Optional[bool] = None,
    priority: Optional[int] = None,
    search: Optional[str] = None,
) -> Any:
    """
    Retrieve tasks with optional filtering.
    """
    if completed is not None:
        tasks = crud.task.get_multi_by_completed(
            db, completed=completed, skip=skip, limit=limit
        )
    elif priority is not None:
        tasks = crud.task.get_multi_by_priority(
            db, priority=priority, skip=skip, limit=limit
        )
    elif search is not None:
        tasks = crud.task.search_by_title(
            db, title=search, skip=skip, limit=limit
        )
    else:
        tasks = crud.task.get_multi(db, skip=skip, limit=limit)
    return tasks


@router.post("/", response_model=schemas.Task, status_code=status.HTTP_201_CREATED)
def create_task(
    *,
    db: Session = Depends(get_db),
    task_in: schemas.TaskCreate,
) -> Any:
    """
    Create new task.
    """
    task = crud.task.create(db=db, obj_in=task_in)
    return task


@router.get("/{task_id}", response_model=schemas.Task)
def read_task(
    *,
    db: Session = Depends(get_db),
    task_id: int,
) -> Any:
    """
    Get task by ID.
    """
    task = crud.task.get(db=db, id=task_id)
    if not task:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Task not found"
        )
    return task


@router.put("/{task_id}", response_model=schemas.Task)
def update_task(
    *,
    db: Session = Depends(get_db),
    task_id: int,
    task_in: schemas.TaskUpdate,
) -> Any:
    """
    Update a task.
    """
    task = crud.task.get(db=db, id=task_id)
    if not task:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Task not found"
        )
    task = crud.task.update(db=db, db_obj=task, obj_in=task_in)
    return task


@router.delete("/{task_id}", status_code=status.HTTP_204_NO_CONTENT, response_model=None)
def delete_task(
    *,
    db: Session = Depends(get_db),
    task_id: int,
) -> Any:
    """
    Delete a task.
    """
    task = crud.task.get(db=db, id=task_id)
    if not task:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Task not found"
        )
    crud.task.remove(db=db, id=task_id)
    return None
```
0  app/core/__init__.py  Normal file

22  app/core/config.py  Normal file
@@ -0,0 +1,22 @@

```python
import os
from pathlib import Path

from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # API settings
    PROJECT_NAME: str = "Task Manager API"
    API_V1_STR: str = "/api/v1"
    DEBUG: bool = os.getenv("DEBUG", "False").lower() in ("true", "1", "t")

    # SQLite Database settings
    DB_DIR: Path = Path("/app") / "storage" / "db"

    class Config:
        env_file = ".env"
        case_sensitive = True


# Create settings instance
settings = Settings()

# Ensure DB directory exists
settings.DB_DIR.mkdir(parents=True, exist_ok=True)
```
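The `DEBUG` setting above accepts several truthy spellings. The parsing idiom in isolation, as a stdlib-only sketch (the `env_flag` helper name is hypothetical, not part of the commit):

```python
import os


def env_flag(name: str, default: str = "False") -> bool:
    """Parse an environment variable as a boolean, accepting 'true', '1', or 't' (case-insensitive)."""
    return os.getenv(name, default).lower() in ("true", "1", "t")


os.environ["DEBUG"] = "True"
assert env_flag("DEBUG") is True
os.environ["DEBUG"] = "0"
assert env_flag("DEBUG") is False
del os.environ["DEBUG"]
assert env_flag("DEBUG") is False  # falls back to the default
```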
62  app/core/exceptions.py  Normal file
@@ -0,0 +1,62 @@

```python
from fastapi import HTTPException, Request, status
from fastapi.exceptions import RequestValidationError
from fastapi.responses import JSONResponse
from starlette.exceptions import HTTPException as StarletteHTTPException

from app.core.config import settings


async def http_exception_handler(request: Request, exc: HTTPException) -> JSONResponse:
    """
    Handle HTTPExceptions and return a consistent response format.
    """
    return JSONResponse(
        status_code=exc.status_code,
        content={
            "success": False,
            "message": exc.detail,
            "data": None,
        },
    )


async def validation_exception_handler(request: Request, exc: RequestValidationError) -> JSONResponse:
    """
    Handle validation errors and return a consistent response format.
    """
    return JSONResponse(
        status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
        content={
            "success": False,
            "message": "Validation error",
            "data": None,
            "errors": exc.errors(),
        },
    )


async def unexpected_exception_handler(request: Request, exc: Exception) -> JSONResponse:
    """
    Handle all other exceptions with a generic 500 error.
    """
    return JSONResponse(
        status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
        content={
            "success": False,
            "message": "Internal server error",
            "data": None,
        },
    )


def register_exception_handlers(app):
    """
    Register exception handlers with the FastAPI app.
    """
    app.add_exception_handler(StarletteHTTPException, http_exception_handler)
    app.add_exception_handler(HTTPException, http_exception_handler)
    app.add_exception_handler(RequestValidationError, validation_exception_handler)

    # Only register generic handler in production
    if not settings.DEBUG:
        app.add_exception_handler(Exception, unexpected_exception_handler)
```
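All three handlers above return the same `{success, message, data}` envelope. That shape can be factored into a single helper, sketched here with the stdlib only (the `error_envelope` function is illustrative, not code from the commit):

```python
import json


def error_envelope(message: str, **extra) -> dict:
    """Build the consistent error body shared by the handlers above."""
    body = {"success": False, "message": message, "data": None}
    body.update(extra)  # e.g. the validation handler's extra 'errors' list
    return body


assert error_envelope("Task not found") == {
    "success": False,
    "message": "Task not found",
    "data": None,
}
print(json.dumps(error_envelope("Validation error", errors=[])))
```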
1  app/crud/__init__.py  Normal file
@@ -0,0 +1 @@

```python
from app.crud.task import task  # noqa
```

64  app/crud/base.py  Normal file
@@ -0,0 +1,64 @@

```python
from typing import Any, Dict, Generic, List, Optional, Type, TypeVar, Union

from fastapi.encoders import jsonable_encoder
from pydantic import BaseModel
from sqlalchemy.orm import Session

from app.db.base_class import Base

ModelType = TypeVar("ModelType", bound=Base)
CreateSchemaType = TypeVar("CreateSchemaType", bound=BaseModel)
UpdateSchemaType = TypeVar("UpdateSchemaType", bound=BaseModel)


class CRUDBase(Generic[ModelType, CreateSchemaType, UpdateSchemaType]):
    def __init__(self, model: Type[ModelType]):
        """
        CRUD object with default methods to Create, Read, Update, Delete (CRUD).

        **Parameters**

        * `model`: A SQLAlchemy model class
        """
        self.model = model

    def get(self, db: Session, id: Any) -> Optional[ModelType]:
        return db.query(self.model).filter(self.model.id == id).first()

    def get_multi(
        self, db: Session, *, skip: int = 0, limit: int = 100
    ) -> List[ModelType]:
        return db.query(self.model).offset(skip).limit(limit).all()

    def create(self, db: Session, *, obj_in: CreateSchemaType) -> ModelType:
        obj_in_data = jsonable_encoder(obj_in)
        db_obj = self.model(**obj_in_data)
        db.add(db_obj)
        db.commit()
        db.refresh(db_obj)
        return db_obj

    def update(
        self,
        db: Session,
        *,
        db_obj: ModelType,
        obj_in: Union[UpdateSchemaType, Dict[str, Any]]
    ) -> ModelType:
        obj_data = jsonable_encoder(db_obj)
        if isinstance(obj_in, dict):
            update_data = obj_in
        else:
            # exclude_unset=True so partial updates only touch fields the client sent
            update_data = obj_in.model_dump(exclude_unset=True)
        for field in obj_data:
            if field in update_data:
                setattr(db_obj, field, update_data[field])
        db.add(db_obj)
        db.commit()
        db.refresh(db_obj)
        return db_obj

    def remove(self, db: Session, *, id: int) -> ModelType:
        obj = db.query(self.model).get(id)
        db.delete(obj)
        db.commit()
        return obj
```
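The `Generic[ModelType, CreateSchemaType, UpdateSchemaType]` pattern is independent of SQLAlchemy: one parameterized base class gives every model a typed create/get/remove for free. A minimal in-memory sketch of the same idea (the `Store` class is hypothetical, not part of the commit):

```python
from typing import Dict, Generic, Optional, TypeVar

T = TypeVar("T")


class Store(Generic[T]):
    """In-memory analogue of CRUDBase: one typed store per model type."""

    def __init__(self) -> None:
        self._items: Dict[int, T] = {}
        self._next_id = 1

    def create(self, obj: T) -> int:
        obj_id = self._next_id
        self._items[obj_id] = obj
        self._next_id += 1
        return obj_id

    def get(self, obj_id: int) -> Optional[T]:
        return self._items.get(obj_id)

    def remove(self, obj_id: int) -> Optional[T]:
        return self._items.pop(obj_id, None)


store: "Store[str]" = Store()
tid = store.create("write docs")
assert store.get(tid) == "write docs"
assert store.remove(tid) == "write docs"
assert store.get(tid) is None
```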
45  app/crud/task.py  Normal file
@@ -0,0 +1,45 @@

```python
from typing import List

from sqlalchemy.orm import Session

from app.crud.base import CRUDBase
from app.models.task import Task
from app.schemas.task import TaskCreate, TaskUpdate


class CRUDTask(CRUDBase[Task, TaskCreate, TaskUpdate]):
    def get_multi_by_completed(
        self, db: Session, *, completed: bool, skip: int = 0, limit: int = 100
    ) -> List[Task]:
        return (
            db.query(self.model)
            .filter(Task.completed == completed)
            .offset(skip)
            .limit(limit)
            .all()
        )

    def get_multi_by_priority(
        self, db: Session, *, priority: int, skip: int = 0, limit: int = 100
    ) -> List[Task]:
        return (
            db.query(self.model)
            .filter(Task.priority == priority)
            .offset(skip)
            .limit(limit)
            .all()
        )

    def search_by_title(
        self, db: Session, *, title: str, skip: int = 0, limit: int = 100
    ) -> List[Task]:
        return (
            db.query(self.model)
            .filter(Task.title.ilike(f"%{title}%"))
            .offset(skip)
            .limit(limit)
            .all()
        )


task = CRUDTask(Task)
```
0  app/db/__init__.py  Normal file

4  app/db/base.py  Normal file
@@ -0,0 +1,4 @@

```python
# Import all the models here so Alembic can detect them
from app.db.base_class import Base  # noqa
# Import all models below
from app.models.task import Task  # noqa
```

13  app/db/base_class.py  Normal file
@@ -0,0 +1,13 @@

```python
from typing import Any

from sqlalchemy.ext.declarative import declared_attr
from sqlalchemy.orm import as_declarative


@as_declarative()
class Base:
    id: Any
    __name__: str

    # Generate __tablename__ automatically
    @declared_attr
    def __tablename__(cls) -> str:
        return cls.__name__.lower()
```
28  app/db/session.py  Normal file
@@ -0,0 +1,28 @@

```python
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

from app.core.config import settings

# SQLite database URL
SQLALCHEMY_DATABASE_URL = f"sqlite:///{settings.DB_DIR}/db.sqlite"

# Create SQLAlchemy engine
engine = create_engine(
    SQLALCHEMY_DATABASE_URL,
    connect_args={"check_same_thread": False},  # Only needed for SQLite
)

# Create session factory
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

# Create base class for models
Base = declarative_base()


# Dependency for getting DB session
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
```
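The `get_db` generator guarantees the session is closed even when a request handler raises, because FastAPI drives the generator to completion and the `finally` block always runs. The same lifecycle shown with stdlib `sqlite3` instead of SQLAlchemy (an illustrative sketch, not the commit's code):

```python
import sqlite3


def get_conn(path: str = ":memory:"):
    """Yield a connection and always close it, mirroring get_db above."""
    conn = sqlite3.connect(path)
    try:
        yield conn
    finally:
        conn.close()


gen = get_conn()
conn = next(gen)
assert conn.execute("SELECT 1").fetchone() == (1,)
gen.close()  # raises GeneratorExit at the yield, so the finally block closes the connection
closed = False
try:
    conn.execute("SELECT 1")
except sqlite3.ProgrammingError:
    closed = True
assert closed
```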
0  app/models/__init__.py  Normal file

15  app/models/task.py  Normal file
@@ -0,0 +1,15 @@

```python
from sqlalchemy import Column, Integer, String, Boolean, DateTime, Text
from sqlalchemy.sql import func

from app.db.base_class import Base


class Task(Base):
    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(255), nullable=False, index=True)
    description = Column(Text, nullable=True)
    completed = Column(Boolean, default=False)
    priority = Column(Integer, default=1)  # 1=Low, 2=Medium, 3=High
    due_date = Column(DateTime, nullable=True)
    created_at = Column(DateTime, default=func.now(), nullable=False)
    updated_at = Column(DateTime, default=func.now(), onupdate=func.now(), nullable=False)
```
1  app/schemas/__init__.py  Normal file
@@ -0,0 +1 @@

```python
from app.schemas.task import Task, TaskCreate, TaskUpdate  # noqa
```

42  app/schemas/task.py  Normal file
@@ -0,0 +1,42 @@

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, Field


# Shared properties
class TaskBase(BaseModel):
    title: str = Field(..., min_length=1, max_length=255, description="Task title")
    description: Optional[str] = Field(None, description="Task description")
    completed: Optional[bool] = Field(False, description="Task completion status")
    priority: Optional[int] = Field(1, ge=1, le=3, description="Task priority: 1-Low, 2-Medium, 3-High")
    due_date: Optional[datetime] = Field(None, description="Task due date")


# Properties to receive on task creation
class TaskCreate(TaskBase):
    pass


# Properties to receive on task update
class TaskUpdate(BaseModel):
    title: Optional[str] = Field(None, min_length=1, max_length=255, description="Task title")
    description: Optional[str] = Field(None, description="Task description")
    completed: Optional[bool] = Field(None, description="Task completion status")
    priority: Optional[int] = Field(None, ge=1, le=3, description="Task priority: 1-Low, 2-Medium, 3-High")
    due_date: Optional[datetime] = Field(None, description="Task due date")


# Properties shared by models stored in DB
class TaskInDBBase(TaskBase):
    id: int
    created_at: datetime
    updated_at: datetime

    class Config:
        from_attributes = True


# Properties to return to client
class Task(TaskInDBBase):
    pass
```
54  main.py  Normal file
@@ -0,0 +1,54 @@

```python
import uvicorn
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.api.v1.api import api_router
from app.core.config import settings
from app.core.exceptions import register_exception_handlers

app = FastAPI(
    title=settings.PROJECT_NAME,
    description="Task Manager API",
    version="0.1.0",
    openapi_url="/openapi.json",
    docs_url="/docs",
    redoc_url="/redoc",
)

# Set up CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Register exception handlers
register_exception_handlers(app)

# Include routers
app.include_router(api_router, prefix="/api/v1")


# Health check endpoint
@app.get("/health", tags=["health"])
async def health_check():
    from sqlalchemy import text

    from app.db.session import engine

    # Check database connectivity
    try:
        with engine.connect() as connection:
            connection.execute(text("SELECT 1"))
        db_status = "connected"
    except Exception as e:
        db_status = f"error: {str(e)}"

    return {
        "status": "healthy",
        "database": db_status,
        "version": "0.1.0",
    }


if __name__ == "__main__":
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)
```
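The `/health` handler's database probe reduces to "open a connection and run `SELECT 1`". In isolation, with stdlib `sqlite3` in place of the SQLAlchemy engine (a sketch; the real endpoint uses the engine from `app.db.session`):

```python
import sqlite3


def health(db_path: str = ":memory:") -> dict:
    """Report API health, probing the database with SELECT 1 as above."""
    try:
        conn = sqlite3.connect(db_path)
        try:
            conn.execute("SELECT 1")
        finally:
            conn.close()
        db_status = "connected"
    except Exception as e:
        db_status = f"error: {e}"
    return {"status": "healthy", "database": db_status, "version": "0.1.0"}


assert health() == {"status": "healthy", "database": "connected", "version": "0.1.0"}
```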
1  migrations/README  Normal file
@@ -0,0 +1 @@

Generic single-database configuration with SQLite.
80  migrations/env.py  Normal file
@@ -0,0 +1,80 @@

```python
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
from app.db.base import Base  # noqa
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.
    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.
    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        is_sqlite = connection.dialect.name == 'sqlite'
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            render_as_batch=is_sqlite,  # Key configuration for SQLite
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
```
24  migrations/script.py.mako  Normal file
@@ -0,0 +1,24 @@

```mako
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
```
43
migrations/versions/78b33b9de3eb_create_tasks_table.py
Normal file
43
migrations/versions/78b33b9de3eb_create_tasks_table.py
Normal file
@ -0,0 +1,43 @@
```python
"""create tasks table

Revision ID: 78b33b9de3eb
Revises:
Create Date: 2023-09-01 12:00:00.000000

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '78b33b9de3eb'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Create tasks table
    op.create_table(
        'task',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('title', sa.String(255), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('completed', sa.Boolean(), default=False),
        sa.Column('priority', sa.Integer(), default=1),
        sa.Column('due_date', sa.DateTime(), nullable=True),
        sa.Column('created_at', sa.DateTime(), nullable=False, server_default=sa.func.now()),
        sa.Column('updated_at', sa.DateTime(), nullable=False, server_default=sa.func.now(), onupdate=sa.func.now()),
        sa.PrimaryKeyConstraint('id')
    )

    # Create indexes on id and title for faster lookups
    op.create_index(op.f('ix_task_id'), 'task', ['id'], unique=False)
    op.create_index(op.f('ix_task_title'), 'task', ['title'], unique=False)


def downgrade() -> None:
    # Drop the indexes, then the tasks table
    op.drop_index(op.f('ix_task_title'), table_name='task')
    op.drop_index(op.f('ix_task_id'), table_name='task')
    op.drop_table('task')
```
12 requirements.txt Normal file
@@ -0,0 +1,12 @@
```text
fastapi>=0.103.1
uvicorn>=0.23.2
sqlalchemy>=2.0.20
alembic>=1.12.0
pydantic>=2.3.0
pydantic-settings>=2.0.3
python-dotenv>=1.0.0
python-multipart>=0.0.6
email-validator>=2.0.0
ruff>=0.0.290
pytest>=7.4.0
httpx>=0.24.1
```
0 tests/__init__.py Normal file
53 tests/conftest.py Normal file
@@ -0,0 +1,53 @@
```python
import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from fastapi.testclient import TestClient

from app.db.base import Base
from app.db.session import get_db
from main import app


# Create test database
TEST_SQLALCHEMY_DATABASE_URL = "sqlite:///./test_db.sqlite"

engine = create_engine(
    TEST_SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


@pytest.fixture(scope="function")
def db():
    """
    Create a fresh database for each test.
    """
    # Create tables
    Base.metadata.create_all(bind=engine)

    # Create session
    db = TestingSessionLocal()
    try:
        yield db
    finally:
        db.close()

    # Drop tables after test
    Base.metadata.drop_all(bind=engine)


@pytest.fixture(scope="function")
def client(db):
    """
    Create a test client with a test database session.
    """
    def override_get_db():
        try:
            yield db
        finally:
            pass

    app.dependency_overrides[get_db] = override_get_db
    with TestClient(app) as c:
        yield c
    app.dependency_overrides.clear()
```
14 tests/test_health.py Normal file
@@ -0,0 +1,14 @@
```python
from fastapi import status


def test_health_endpoint(client):
    """
    Test the health endpoint returns correct response.
    """
    response = client.get("/health")
    data = response.json()

    assert response.status_code == status.HTTP_200_OK
    assert data["status"] == "healthy"
    assert "database" in data
    assert "version" in data
```
152 tests/test_tasks.py Normal file
@@ -0,0 +1,152 @@
```python
from fastapi import status

from app.models.task import Task


def test_create_task(client, db):
    """
    Test creating a new task.
    """
    task_data = {
        "title": "Test Task",
        "description": "This is a test task",
        "priority": 2
    }

    response = client.post("/api/v1/tasks/", json=task_data)
    data = response.json()

    assert response.status_code == status.HTTP_201_CREATED
    assert data["title"] == task_data["title"]
    assert data["description"] == task_data["description"]
    assert data["priority"] == task_data["priority"]
    assert "id" in data
    assert "created_at" in data


def test_read_tasks(client, db):
    """
    Test getting a list of tasks.
    """
    # Create test tasks
    task1 = Task(
        title="Task 1",
        description="Description 1",
        priority=1
    )
    task2 = Task(
        title="Task 2",
        description="Description 2",
        priority=2
    )
    db.add(task1)
    db.add(task2)
    db.commit()

    response = client.get("/api/v1/tasks/")
    data = response.json()

    assert response.status_code == status.HTTP_200_OK
    assert len(data) == 2
    assert data[0]["title"] == "Task 1"
    assert data[1]["title"] == "Task 2"


def test_read_task(client, db):
    """
    Test getting a specific task.
    """
    # Create test task
    task = Task(
        title="Test Task",
        description="Test Description",
        priority=2
    )
    db.add(task)
    db.commit()

    response = client.get(f"/api/v1/tasks/{task.id}")
    data = response.json()

    assert response.status_code == status.HTTP_200_OK
    assert data["title"] == task.title
    assert data["description"] == task.description
    assert data["priority"] == task.priority


def test_update_task(client, db):
    """
    Test updating a task.
    """
    # Create test task
    task = Task(
        title="Old Title",
        description="Old Description",
        priority=1
    )
    db.add(task)
    db.commit()

    update_data = {
        "title": "New Title",
        "description": "New Description",
        "priority": 3
    }

    response = client.put(f"/api/v1/tasks/{task.id}", json=update_data)
    data = response.json()

    assert response.status_code == status.HTTP_200_OK
    assert data["title"] == update_data["title"]
    assert data["description"] == update_data["description"]
    assert data["priority"] == update_data["priority"]


def test_delete_task(client, db):
    """
    Test deleting a task.
    """
    # Create test task
    task = Task(
        title="Task to Delete",
        description="This task will be deleted",
        priority=2
    )
    db.add(task)
    db.commit()

    response = client.delete(f"/api/v1/tasks/{task.id}")

    assert response.status_code == status.HTTP_204_NO_CONTENT

    # Verify task was deleted
    task_in_db = db.query(Task).filter(Task.id == task.id).first()
    assert task_in_db is None


def test_filter_tasks_by_completed(client, db):
    """
    Test filtering tasks by completed status.
    """
    # Create test tasks
    task1 = Task(
        title="Task 1",
        description="Description 1",
        completed=True
    )
    task2 = Task(
        title="Task 2",
        description="Description 2",
        completed=False
    )
    db.add(task1)
    db.add(task2)
    db.commit()

    response = client.get("/api/v1/tasks/?completed=true")
    data = response.json()

    assert response.status_code == status.HTTP_200_OK
    assert len(data) == 1
    assert data[0]["title"] == "Task 1"
    assert data[0]["completed"] is True
```