Update endpoint paths and lint codebase

parent f48a7bb12c
commit ab3e9f3a5c
46 changed files with 3064 additions and 5122 deletions
@@ -2,16 +2,34 @@ name: CI

on:
  push:
    branches:
      - "*"
  pull_request:
    branches:
      - main

jobs:
  lint-and-test:
    name: Lint & Test
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Install uv
        run: curl -LsSf https://astral.sh/uv/install.sh | sh
      - name: Add uv to PATH
        run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
      - name: Install dependencies
        run: uv sync --all-groups
      - name: Run mypy type check
        run: uv run mypy .
      - name: Run ruff check
        run: uvx ruff check .
      - name: Run ruff format
        run: uvx ruff format --check .
      - name: Run pytest
        run: uv run pytest --verbose --tb=short

  build-and-push:
    name: Build and push Docker image
    runs-on: ubuntu-latest
    needs: [lint-and-test]
    permissions:
      contents: read
      packages: write
.gitignore (vendored, 2 changes)

@@ -1,5 +1,5 @@
# ---> Python
# Byte-compiled / optimized / DLL files
# Byte-compiled / optimised / DLL files
__pycache__/
*.py[cod]
*$py.class
@@ -86,7 +86,7 @@ This project follows a unified single-server architecture:

- **Single FastAPI application** serving multiple tool endpoints
- **Modular tool design** with individual tools in `openapi_mcp_server/tools/`
- **OpenAPI compliance** with full documentation and validation
- **Docker-ready** with optimized multi-stage builds
- **Docker-ready** with optimised multi-stage builds
- **uv package management** for fast, reliable dependency handling

## License
@@ -9,7 +9,7 @@ issue tracking within your Multi-agent Conversation Platform.

This tool allows you to programmatically interact with Forgejo and Gitea servers. It provides
endpoints to:

- Retrieve server information (e.g., version).
- Retrieve server information (e.g. version).
- Perform repository operations (list, search, read file content, list branches, list commits).
- Interact with Forgejo Actions (list workflow runs, list jobs, get job logs).
- Manage issues and pull requests (list, get details, list comments).

@@ -61,7 +61,7 @@ The Forgejo tool can be configured via environment variables or the `config.yaml`

### Environment Variables

- `FORGEJO_BASE_URL`: The base URL of your Forgejo or Gitea instance (e.g., `https://git.example.com`).
- `FORGEJO_BASE_URL`: The base URL of your Forgejo or Gitea instance (e.g. `https://git.example.com`).
- `FORGEJO_API_KEY`: Your personal access token for authentication.
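As a client-side sketch only (the fallback URL is a placeholder, not a value from this commit), picking these variables up looks like:

```python
import os

# Read the Forgejo connection settings named in the list above.
base_url = os.environ.get("FORGEJO_BASE_URL", "https://git.example.com")
api_key = os.environ.get("FORGEJO_API_KEY")

if api_key is None:
    raise RuntimeError("FORGEJO_API_KEY is not set")
```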

### `config.yaml`
@@ -170,7 +170,7 @@ These endpoints handle repository-level actions.

It can be any text or binary data.
```

- **Example Response (for a binary file, e.g., an image):**
- **Example Response (for a binary file, e.g. an image):**

```
[Binary content of the image file]
@@ -6,7 +6,7 @@ compatibility.

## Overview

The SearXNG tool acts as a proxy server for SearXNG search instances, providing search capabilities
within the OpenAPI MCP Server framework. It offers a standardized API interface for performing web
within the OpenAPI MCP Server framework. It offers a standardised API interface for performing web
searches across multiple search engines through SearXNG.

## Endpoints
@@ -16,7 +16,7 @@ current time retrieval, timezone conversions, Unix timestamp conversion, and tim

| `POST` | `/time/unix_to_iso` | Convert Unix epoch timestamp to ISO format. Accepts Unix timestamp (float) and optional target timezone. |
| `POST` | `/time/convert_time` | Convert timestamp from one timezone to another. Requires timestamp, source timezone, and target timezone. |
| `POST` | `/time/elapsed_time` | Calculate time difference between two timestamps. Returns difference in specified units (seconds, minutes, hours, days). |
| `POST` | `/time/parse_timestamp` | Parse flexible human-readable timestamps (e.g. 'June 1st 2024 3:30 PM', 'tomorrow at noon', '2024-06-01 15:30') into standardized UTC ISO format. |
| `POST` | `/time/parse_timestamp` | Parse flexible human-readable timestamps (e.g. 'June 1st 2024 3:30 PM', 'tomorrow at noon', '2024-06-01 15:30') into standardised UTC ISO format. |
| `GET` | `/time/list_time_zones` | Get list of all valid IANA timezone names for use with other endpoints. |

## Key Features

@@ -25,7 +25,7 @@ current time retrieval, timezone conversions, Unix timestamp conversion, and tim

- **Timezone Aware**: Full IANA timezone support with automatic conversions
- **Unix Compatibility**: Convert Unix epoch timestamps to readable ISO format
- **Time Calculations**: Calculate elapsed time between any two timestamps
- **Standardized Output**: All timestamps returned in ISO 8601 format
- **Standardised Output**: All timestamps returned in ISO 8601 format
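A hedged usage sketch for the `unix_to_iso` endpoint described in the table above (the JSON field names and local host are assumptions inferred from the endpoint description, not confirmed by this diff):

```python
import httpx

# Assumed request shape: a Unix timestamp (float) plus an optional timezone.
resp = httpx.post(
    "http://localhost:8000/time/unix_to_iso",
    json={"timestamp": 1717253400.0, "timezone": "Europe/London"},
    timeout=10.0,
)
resp.raise_for_status()
print(resp.json())  # expected: an ISO 8601 string per the feature list above
```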

## Usage

@@ -30,5 +30,5 @@ available under the `/web` prefix when the server is running. Visit `/docs` for

## Technology

Built using trafilatura, a Python library specialized in extracting and processing web content,
Built using trafilatura, a Python library specialised in extracting and processing web content,
ensuring high-quality text extraction from web pages.
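For context, this is the library's documented two-step fetch/extract flow (a minimal illustration, not code from this repository):

```python
import trafilatura

# Fetch a page, then extract its main text; both calls return None on failure.
downloaded = trafilatura.fetch_url("https://example.com")
if downloaded is not None:
    text = trafilatura.extract(downloaded)
    print(text)
```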

@@ -41,7 +41,7 @@ class AppConfig:

    memory_file_path: str
    searxng_base_url: str
    default_timezone: str
    default_location: str
    default_location: str | None
    web_user_agent: str

    def __new__(cls) -> Self:

@@ -55,10 +55,10 @@ class AppConfig:

        """
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._initialize()  # noqa: SLF001
            cls._instance._initialise()  # noqa: SLF001
        return cls._instance

    def _initialize(self) -> None:
    def _initialise(self) -> None:
        """Initialises the configuration by loading it from `config.yaml`.

        This method reads the YAML file and populates the configuration
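The `__new__` override above is a classic class-level singleton: the first construction runs one-time setup, every later construction returns the same object. A self-contained sketch of the pattern, independent of this codebase's actual fields:

```python
class Singleton:
    """Minimal singleton: __new__ creates the instance once, then reuses it."""

    _instance = None

    def __new__(cls) -> "Singleton":
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._initialise()
        return cls._instance

    def _initialise(self) -> None:
        # One-time setup: runs only on the first construction.
        self.loaded = True


assert Singleton() is Singleton()  # both calls return the same object
```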

@@ -73,19 +73,14 @@ class AppConfig:

        self.default_location = "London, UK"  # Default value
        self.web_user_agent = "Mozilla/5.0 (compatible; OpenAPI-MCP-Server/1.0)"  # Default value
        config_path = Path("config.yaml")
        print(f"DEBUG: _initialize - config_path: {config_path}")
        print(f"DEBUG: _initialize - config_path.exists(): {config_path.exists()}")
        if config_path.exists():
            with config_path.open(encoding="utf-8") as f:
                config_data = yaml.safe_load(f)
            print(f"DEBUG: _initialize - config_data: {config_data}")
            if config_data:
                self.tools = config_data.get("app", {}).get("tools", {})
                # Extract forgejo tokens specifically
                forgejo_tool_config = self.tools.get("forgejo", {})
                print(f"DEBUG: _initialize - forgejo_tool_config: {forgejo_tool_config}")
                self.forgejo_tokens = forgejo_tool_config.get("tokens", {})
                print(f"DEBUG: _initialize - self.forgejo_tokens: {self.forgejo_tokens}")
                # Extract memory file path
                memory_tool_config = self.tools.get("memory", {})
                self.memory_file_path = memory_tool_config.get(
@@ -17,7 +17,7 @@ class HealthResponse(BaseModel):

    This model is used to indicate the health status of a service.

    Attributes:
        status: The health status of the service (e.g., 'healthy').
        status: The health status of the service (e.g. 'healthy').
        service: The name of the service being checked.
    """
@@ -1,4 +1,4 @@
"""Pydantic models for Forgejo API responses.
"""Pydantic models for Forgejo API requests and responses.

Contains data models representing Forgejo entities: users, repositories, branches,
files, commits, workflow runs and jobs, issues, pull requests, comments, labels,
@@ -16,52 +16,13 @@ from datetime import datetime  # noqa: TC003

from pydantic import BaseModel, Field


class ForgejoInstance(BaseModel):
    """Represents a configured Forgejo instance."""
# Base entity models (alphabetical order)
class ForgejoBranch(BaseModel):
    """Represents a Forgejo branch."""

    base_url: str = Field(description="The base URL of the Forgejo instance.")


class ForgejoVersion(BaseModel):
    """Represents the version information of a Forgejo instance."""

    version: str = Field(description="The version string of the Forgejo instance.")


class ForgejoUser(BaseModel):
    """Represents a Forgejo user."""

    id: int = Field(description="The user's ID.")
    login: str = Field(description="The user's login name.")


class ForgejoRepository(BaseModel):
    """Represents a Forgejo repository."""

    id: int = Field(description="The repository's ID.")
    name: str = Field(description="The repository's name.")
    full_name: str = Field(description="The full name of the repository (owner/name).")
    owner: ForgejoUser = Field(description="The owner of the repository.")
    html_url: str = Field(description="The URL to the repository's page.")
    description: str | None = Field(description="The repository's description.")
    fork: bool = Field(description="Whether the repository is a fork.")
    empty: bool = Field(description="Whether the repository is empty.")
    private: bool = Field(description="Whether the repository is private.")
    archived: bool = Field(description="Whether the repository is archived.")
    mirror: bool = Field(description="Whether the repository is a mirror.")
    size: int = Field(description="Size of the repository in kilobytes.")
    created_at: datetime = Field(description="The creation time of the repository.")
    updated_at: datetime = Field(description="The last update time of the repository.")
    pushed_at: datetime | None = Field(description="The last push time of the repository.")
    default_branch: str = Field(description="The name of the default branch.")


class ForgejoCommitUser(BaseModel):
    """Represents a commit author or committer."""

    name: str = Field(description="The name of the commit user.")
    email: str = Field(description="The email of the commit user.")
    username: str | None = Field(default=None, description="The username of the commit user.")
    name: str = Field(description="The name of the branch.")
    commit: ForgejoCommit = Field(description="The latest commit on the branch.")
    protected: bool = Field(description="Whether the branch is protected.")


class ForgejoCommit(BaseModel):
@@ -75,12 +36,12 @@ class ForgejoCommit(BaseModel):

    timestamp: datetime = Field(description="The timestamp of the commit.")


class ForgejoBranch(BaseModel):
    """Represents a Forgejo branch."""
class ForgejoCommitUser(BaseModel):
    """Represents a commit author or committer."""

    name: str = Field(description="The name of the branch.")
    commit: ForgejoCommit = Field(description="The latest commit on the branch.")
    protected: bool = Field(description="Whether the branch is protected.")
    name: str = Field(description="The name of the commit user.")
    email: str = Field(description="The email of the commit user.")
    username: str | None = Field(default=None, description="The username of the commit user.")


class ForgejoFile(BaseModel):
@@ -89,9 +50,9 @@ class ForgejoFile(BaseModel):

    name: str = Field(description="The name of the file or directory.")
    path: str = Field(description="The path of the file or directory within the repository.")
    sha: str = Field(description="The SHA of the file or directory.")
    type: str = Field(description='The type of the entry (e.g., "file", "dir").')
    type: str = Field(description='The type of the entry (e.g. "file", "dir").')
    size: int = Field(description="The size of the file in bytes.")
    encoding: str | None = Field(description='The encoding of the file content (e.g., "base64").')
    encoding: str | None = Field(description='The encoding of the file content (e.g. "base64").')
    content: str | None = Field(description="The Base64 encoded content of the file.")
    url: str = Field(description="The API URL to the file or directory.")
    html_url: str | None = Field(description="The HTML URL to the file or directory.")
@@ -99,94 +60,10 @@ class ForgejoFile(BaseModel):

    download_url: str | None = Field(description="The download URL for the file.")


class ForgejoWorkflowRun(BaseModel):
    """Represents a Forgejo Actions workflow run."""
class ForgejoInstance(BaseModel):
    """Represents a configured Forgejo instance."""

    id: int = Field(description="The ID of the workflow run.")
    name: str = Field(description="The name of the workflow.")
    head_branch: str = Field(description="The branch the workflow run was triggered on.")
    head_sha: str = Field(description="The SHA of the head commit.")
    url: str = Field(description="The API URL to the workflow run.")
    html_url: str = Field(description="The HTML URL to the workflow run.")
    status: str = Field(
        description='The status of the workflow run (e.g., "completed", "in_progress").'
    )
    conclusion: str | None = Field(
        description='The conclusion of the workflow run (e.g., "success", "failure").'
    )
    created_at: datetime = Field(description="The creation time of the workflow run.")
    updated_at: datetime = Field(description="The last update time of the workflow run.")
    run_number: int = Field(description="The run number of the workflow.")
    head_commit: ForgejoCommit | None = Field(description="The head commit of the workflow run.")


class ForgejoWorkflowStep(BaseModel):
    """Represents a step within a Forgejo Actions workflow job."""

    name: str = Field(description="The name of the step.")
    status: str = Field(description='The status of the step (e.g., "completed").')
    conclusion: str | None = Field(description='The conclusion of the step (e.g., "success").')
    number: int = Field(description="The step number.")
    started_at: datetime | None = Field(description="The start time of the step.")
    completed_at: datetime | None = Field(description="The completion time of the step.")


class ForgejoWorkflowJob(BaseModel):
    """Represents a Forgejo Actions workflow job."""

    id: int = Field(description="The ID of the job.")
    run_id: int = Field(description="The ID of the workflow run this job belongs to.")
    run_url: str = Field(description="The API URL to the workflow run.")
    head_sha: str = Field(description="The SHA of the head commit.")
    url: str = Field(description="The API URL to the job.")
    html_url: str = Field(description="The HTML URL to the job.")
    status: str = Field(description='The status of the job (e.g., "completed").')
    conclusion: str | None = Field(description='The conclusion of the job (e.g., "success").')
    started_at: datetime = Field(description="The start time of the job.")
    completed_at: datetime | None = Field(description="The completion time of the job.")
    name: str = Field(description="The name of the job.")
    steps: list[ForgejoWorkflowStep] | None = Field(description="The steps within the job.")
    runner_id: int | None = Field(description="The ID of the runner that executed the job.")
    runner_name: str | None = Field(description="The name of the runner that executed the job.")
    runner_group_id: int | None = Field(description="The ID of the runner group.")
    runner_group_name: str | None = Field(description="The name of the runner group.")


class ForgejoJobLog(BaseModel):
    """Represents the log content of a Forgejo Actions job."""

    log_content: str = Field(description="The raw log content of the job.")


class ForgejoLabel(BaseModel):
    """Represents a label associated with an issue or pull request."""

    id: int = Field(description="The ID of the label.")
    name: str = Field(description="The name of the label.")
    color: str = Field(description="The color of the label (hex code).")
    description: str | None = Field(description="The description of the label.")


class ForgejoMilestone(BaseModel):
    """Represents a milestone for issues or pull requests."""

    id: int = Field(description="The ID of the milestone.")
    title: str = Field(description="The title of the milestone.")
    description: str | None = Field(description="The description of the milestone.")
    state: str = Field(description='The state of the milestone (e.g., "open", "closed").')
    open_issues: int = Field(description="The number of open issues in the milestone.")
    closed_issues: int = Field(description="The number of closed issues in the milestone.")
    created_at: datetime = Field(description="The creation time of the milestone.")
    updated_at: datetime = Field(description="The last update time of the milestone.")
    closed_at: datetime | None = Field(description="The closing time of the milestone.")
    due_on: datetime | None = Field(description="The due date of the milestone.")


class ForgejoPullRequestMeta(BaseModel):
    """Represents metadata for a pull request associated with an issue."""

    merged: bool = Field(description="Whether the pull request has been merged.")
    merged_at: datetime | None = Field(description="The time the pull request was merged.")
    base_url: str = Field(description="The base URL of the Forgejo instance.")


class ForgejoIssue(BaseModel):
@@ -197,7 +74,7 @@ class ForgejoIssue(BaseModel):

    html_url: str = Field(description="The HTML URL to the issue.")
    title: str = Field(description="The title of the issue.")
    body: str | None = Field(description="The body (description) of the issue.")
    state: str = Field(description='The state of the issue (e.g., "open", "closed").')
    state: str = Field(description='The state of the issue (e.g. "open", "closed").')
    comments: int = Field(description="The number of comments on the issue.")
    created_at: datetime = Field(description="The creation time of the issue.")
    updated_at: datetime = Field(description="The last update time of the issue.")
@@ -224,10 +101,40 @@ class ForgejoIssueComment(BaseModel):

    issue_url: str = Field(description="The API URL to the issue the comment belongs to.")


class ForgejoJobLog(BaseModel):
    """Represents the log content of a Forgejo Actions job."""

    log_content: str = Field(description="The raw log content of the job.")


class ForgejoLabel(BaseModel):
    """Represents a label associated with an issue or pull request."""

    id: int = Field(description="The ID of the label.")
    name: str = Field(description="The name of the label.")
    color: str = Field(description="The color of the label (hex code).")
    description: str | None = Field(description="The description of the label.")


class ForgejoMilestone(BaseModel):
    """Represents a milestone for issues or pull requests."""

    id: int = Field(description="The ID of the milestone.")
    title: str = Field(description="The title of the milestone.")
    description: str | None = Field(description="The description of the milestone.")
    state: str = Field(description='The state of the milestone (e.g. "open", "closed").')
    open_issues: int = Field(description="The number of open issues in the milestone.")
    closed_issues: int = Field(description="The number of closed issues in the milestone.")
    created_at: datetime = Field(description="The creation time of the milestone.")
    updated_at: datetime = Field(description="The last update time of the milestone.")
    closed_at: datetime | None = Field(description="The closing time of the milestone.")
    due_on: datetime | None = Field(description="The due date of the milestone.")


class ForgejoPRBranchInfo(BaseModel):
    """Represents branch information for a pull request head or base."""

    label: str = Field(description="The label of the branch (e.g., 'owner:branch-name').")
    label: str = Field(description="The label of the branch (e.g. 'owner:branch-name').")
    ref: str = Field(description="The name of the branch.")
    sha: str = Field(description="The SHA of the latest commit on the branch.")
    repo_id: int = Field(description="The ID of the repository.")
@@ -246,7 +153,7 @@ class ForgejoPullRequest(ForgejoIssue):

    mergeable: bool = Field(description="Whether the pull request is mergeable.")
    merged: bool = Field(description="Whether the pull request has been merged.")
    status: int = Field(
        description="The status of the pull request (e.g., 1 for open, 2 for closed, 3 for merged)."
        description="The status of the pull request (e.g. 1 for open, 2 for closed, 3 for merged)."
    )
    diff_url: str = Field(description="The URL to the diff of the pull request.")
    patch_url: str = Field(description="The URL to the patch of the pull request.")
@@ -276,6 +183,224 @@ class ForgejoPullRequestComment(BaseModel):

    resolver: ForgejoUser | None = Field(description="The user who resolved the comment.")


class ForgejoPullRequestMeta(BaseModel):
    """Represents metadata for a pull request associated with an issue."""

    merged: bool = Field(description="Whether the pull request has been merged.")
    merged_at: datetime | None = Field(description="The time the pull request was merged.")


class ForgejoRepository(BaseModel):
    """Represents a Forgejo repository."""

    id: int = Field(description="The repository's ID.")
    name: str = Field(description="The repository's name.")
    full_name: str = Field(description="The full name of the repository (owner/name).")
    owner: ForgejoUser = Field(description="The owner of the repository.")
    html_url: str = Field(description="The URL to the repository's page.")
    description: str | None = Field(description="The repository's description.")
    fork: bool = Field(description="Whether the repository is a fork.")
    empty: bool = Field(description="Whether the repository is empty.")
    private: bool = Field(description="Whether the repository is private.")
    archived: bool = Field(description="Whether the repository is archived.")
    mirror: bool = Field(description="Whether the repository is a mirror.")
    size: int = Field(description="Size of the repository in kilobytes.")
    created_at: datetime = Field(description="The creation time of the repository.")
    updated_at: datetime = Field(description="The last update time of the repository.")
    pushed_at: datetime | None = Field(description="The last push time of the repository.")
    default_branch: str = Field(description="The name of the default branch.")


class ForgejoUser(BaseModel):
    """Represents a Forgejo user."""

    id: int = Field(description="The user's ID.")
    login: str = Field(description="The user's login name.")


class ForgejoVersion(BaseModel):
    """Represents the version information of a Forgejo instance."""

    version: str = Field(description="The version string of the Forgejo instance.")


class ForgejoWorkflowJob(BaseModel):
    """Represents a Forgejo Actions workflow job."""

    id: int = Field(description="The ID of the job.")
    run_id: int = Field(description="The ID of the workflow run this job belongs to.")
    run_url: str = Field(description="The API URL to the workflow run.")
    head_sha: str = Field(description="The SHA of the head commit.")
    url: str = Field(description="The API URL to the job.")
    html_url: str = Field(description="The HTML URL to the job.")
    status: str = Field(description='The status of the job (e.g. "completed").')
    conclusion: str | None = Field(description='The conclusion of the job (e.g. "success").')
    started_at: datetime = Field(description="The start time of the job.")
    completed_at: datetime | None = Field(description="The completion time of the job.")
    name: str = Field(description="The name of the job.")
    steps: list[ForgejoWorkflowStep] | None = Field(description="The steps within the job.")
    runner_id: int | None = Field(description="The ID of the runner that executed the job.")
    runner_name: str | None = Field(description="The name of the runner that executed the job.")
    runner_group_id: int | None = Field(description="The ID of the runner group.")
    runner_group_name: str | None = Field(description="The name of the runner group.")


class ForgejoWorkflowRun(BaseModel):
    """Represents a Forgejo Actions workflow run."""

    id: int = Field(description="The ID of the workflow run.")
    name: str = Field(description="The name of the workflow.")
    head_branch: str = Field(description="The branch the workflow run was triggered on.")
    head_sha: str = Field(description="The SHA of the head commit.")
    url: str = Field(description="The API URL to the workflow run.")
    html_url: str = Field(description="The HTML URL to the workflow run.")
    status: str = Field(
        description='The status of the workflow run (e.g. "completed", "in_progress").'
    )
    conclusion: str | None = Field(
        description='The conclusion of the workflow run (e.g. "success", "failure").'
    )
    created_at: datetime = Field(description="The creation time of the workflow run.")
    updated_at: datetime = Field(description="The last update time of the workflow run.")
    run_number: int = Field(description="The run number of the workflow.")
    head_commit: ForgejoCommit | None = Field(description="The head commit of the workflow run.")


class ForgejoWorkflowStep(BaseModel):
    """Represents a step within a Forgejo Actions workflow job."""

    name: str = Field(description="The name of the step.")
    status: str = Field(description='The status of the step (e.g. "completed").')
    conclusion: str | None = Field(description='The conclusion of the step (e.g. "success").')
    number: int = Field(description="The step number.")
    started_at: datetime | None = Field(description="The start time of the step.")
    completed_at: datetime | None = Field(description="The completion time of the step.")


# Request models for POST endpoints
class ListInstancesRequest(BaseModel):
    """Request model for listing Forgejo instances."""

    # Empty request model for consistency


class GetVersionRequest(BaseModel):
    """Request model for getting Forgejo version."""

    # Empty request model for consistency


class ListBranchesRequest(BaseModel):
    """Request model for listing repository branches."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")


class ListFilesRequest(BaseModel):
    """Request model for listing files in a directory."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    filepath: str = Field(description="The path to list files from.")
    ref: str | None = Field(default=None, description="The git reference (branch/tag/commit).")


class ListCommitsRequest(BaseModel):
    """Request model for listing repository commits."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    sha: str | None = Field(
        default=None, description="SHA or branch to start listing commits from."
    )


class ListWorkflowRunsRequest(BaseModel):
    """Request model for listing workflow runs."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")


class ListWorkflowJobsRequest(BaseModel):
    """Request model for listing workflow jobs."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    run_id: int = Field(description="The ID of the workflow run.")


class GetJobLogRequest(BaseModel):
    """Request model for getting job logs."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    job_id: int = Field(description="The ID of the workflow job.")


class ListIssuesRequest(BaseModel):
    """Request model for listing issues."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    state: str | None = Field(default=None, description="State filter (open, closed, all).")


class GetIssueRequest(BaseModel):
    """Request model for getting a specific issue."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    index: int = Field(description="The issue number.")


class ListIssueCommentsRequest(BaseModel):
    """Request model for listing issue comments."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    index: int = Field(description="The issue number.")


class ListPullRequestsRequest(BaseModel):
    """Request model for listing pull requests."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    state: str | None = Field(default=None, description="State filter (open, closed, all).")


class GetPullRequestRequest(BaseModel):
    """Request model for getting a specific pull request."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    index: int = Field(description="The pull request number.")


class ListPullRequestCommentsRequest(BaseModel):
    """Request model for listing pull request comments."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    index: int = Field(description="The pull request number.")


class SearchReposRequest(BaseModel):
    """Request model for searching repositories."""

    q: str | None = Field(default=None, description="Search query.")


class GetRawFileRequest(BaseModel):
    """Request model for getting raw file content."""

    owner: str = Field(description="The owner of the repository.")
    repo: str = Field(description="The name of the repository.")
    filepath: str = Field(description="The path to the file.")


# Rebuild models to resolve forward references
ForgejoRepository.model_rebuild()
ForgejoCommit.model_rebuild()
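The rebuild calls matter because the alphabetised file now has `ForgejoBranch` referencing `ForgejoCommit` before it is defined. A stripped-down illustration of the same forward-reference pattern (toy model names, not this module's):

```python
from __future__ import annotations

from pydantic import BaseModel


class Branch(BaseModel):
    name: str
    commit: Commit  # forward reference: Commit is defined below


class Commit(BaseModel):
    sha: str


# With `from __future__ import annotations` the annotation stays a string;
# model_rebuild() resolves it once Commit exists in the module namespace.
Branch.model_rebuild()
print(Branch(name="main", commit=Commit(sha="abc123")))
```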

openapi_mcp_server/tools/forgejo/responses.py (new file, 517 lines)
@@ -0,0 +1,517 @@
"""This module contains response generation logic for the Forgejo tool.

It handles the core functionality of interacting with Forgejo instances,
including repository operations, branch management, file operations, workflow
management, issue tracking, and pull request management.

All functions are designed to be imported and used by the routes module while
keeping the route definitions separate from the business logic.
"""

from __future__ import annotations

import httpx
from fastapi import HTTPException, Response

from openapi_mcp_server.core.config import get_forgejo_config
from openapi_mcp_server.tools.forgejo.models import (
    ForgejoBranch,
    ForgejoCommit,
    ForgejoFile,
    ForgejoInstance,
    ForgejoIssue,
    ForgejoIssueComment,
    ForgejoJobLog,
    ForgejoPullRequest,
    ForgejoPullRequestComment,
    ForgejoRepository,
    ForgejoVersion,
    ForgejoWorkflowJob,
    ForgejoWorkflowRun,
    GetIssueRequest,
    GetJobLogRequest,
    GetPullRequestRequest,
    GetRawFileRequest,
    GetVersionRequest,
    ListBranchesRequest,
    ListCommitsRequest,
    ListFilesRequest,
    ListInstancesRequest,
    ListIssueCommentsRequest,
    ListIssuesRequest,
    ListPullRequestCommentsRequest,
    ListPullRequestsRequest,
    ListWorkflowJobsRequest,
    ListWorkflowRunsRequest,
    SearchReposRequest,
)


async def get_forgejo_instances(
    request: ListInstancesRequest | None = None,  # noqa: ARG001
) -> list[ForgejoInstance]:
    """List the Forgejo instances for which this application has authentication tokens.

    Returns:
        list[ForgejoInstance]: A list of Forgejo instances with configured tokens.
    """
    return [
        ForgejoInstance(base_url=url) for url in get_forgejo_config().get_configured_instance_urls()
    ]


async def get_root_message() -> dict:
    """Root endpoint for the Forgejo tool.

    Returns:
        dict: A welcome message.
    """
    return {"message": "Forgejo tool is working!"}


async def get_forgejo_version(request: GetVersionRequest | None = None) -> ForgejoVersion:  # noqa: ARG001
    """Get the version of the Forgejo instance.

    Returns:
        ForgejoVersion: The version information.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/version", timeout=10.0
            )
            response.raise_for_status()  # Raise an exception for HTTP errors (4xx or 5xx)
            return ForgejoVersion(**response.json())  # httpx's .json() is synchronous
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
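The try/except pair above recurs verbatim in every function below. As a sketch only (the helper name `_forgejo_get` is hypothetical and not part of this commit), the pattern could be factored once:

```python
import httpx
from fastapi import HTTPException


async def _forgejo_get(url: str, **kwargs) -> httpx.Response:
    """Hypothetical shared helper: GET a Forgejo URL, mapping httpx errors
    to the same HTTPExceptions raised throughout this module."""
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(url, timeout=10.0, **kwargs)
            response.raise_for_status()
            return response
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
```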


async def get_repository_branches(request: ListBranchesRequest) -> list[ForgejoBranch]:
    """List all branches for a repository.

    Returns:
        list[ForgejoBranch]: A list of branches.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/branches",
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoBranch(**branch) for branch in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_directory_files(request: ListFilesRequest) -> list[ForgejoFile]:
    """List the contents of a directory or get a file's content.

    Returns:
        list[ForgejoFile]: A list of file/directory contents.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"ref": request.ref} if request.ref else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/contents/{request.filepath}",
                params=params,
                timeout=10.0,
            )
            response.raise_for_status()
            # The API returns a list if it's a directory, or a single dict if it's a file
            json_response = response.json()
            if isinstance(json_response, list):
                return [ForgejoFile(**item) for item in json_response]
            return [ForgejoFile(**json_response)]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
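When a single file comes back, `ForgejoFile.content` holds base64 data per the model definitions above. A consumer-side sketch of decoding it (the helper is illustrative, not part of this commit):

```python
import base64

from openapi_mcp_server.tools.forgejo.models import ForgejoFile


def decode_file_content(file: ForgejoFile) -> bytes | None:
    """Sketch: decode the base64 payload of a single-file response."""
    if file.encoding == "base64" and file.content is not None:
        return base64.b64decode(file.content)
    return None
```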


async def get_repository_commits(request: ListCommitsRequest) -> list[ForgejoCommit]:
    """List commits for a repository.

    Returns:
        list[ForgejoCommit]: A list of commits.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"sha": request.sha} if request.sha else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/commits",
                params=params,
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoCommit(**commit) for commit in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_workflow_runs(request: ListWorkflowRunsRequest) -> list[ForgejoWorkflowRun]:
    """List workflow runs (tasks) for a repository.

    Returns:
        list[ForgejoWorkflowRun]: A list of workflow runs.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/actions/tasks",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoWorkflowRun(**run) for run in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_workflow_jobs(request: ListWorkflowJobsRequest) -> list[ForgejoWorkflowJob]:
    """List workflow jobs for a specific workflow run.

    Returns:
        list[ForgejoWorkflowJob]: A list of workflow jobs.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/actions/runs/{request.run_id}/jobs",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            # The API returns a dictionary with a "jobs" key
            return [ForgejoWorkflowJob(**job) for job in response.json()["jobs"]]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_job_log_content(request: GetJobLogRequest) -> ForgejoJobLog:
    """Get the log for a specific workflow job.

    Returns:
        ForgejoJobLog: The log content of the job.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/actions/jobs/{request.job_id}/log",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return ForgejoJobLog(log_content=response.text)
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_repository_issues(request: ListIssuesRequest) -> list[ForgejoIssue]:
    """List issues for a repository.

    Returns:
        list[ForgejoIssue]: A list of issues.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"state": request.state} if request.state else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/issues",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                params=params,
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoIssue(**issue) for issue in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_specific_issue(request: GetIssueRequest) -> ForgejoIssue:
    """Get a specific issue from a repository.

    Returns:
        ForgejoIssue: The issue details.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/issues/{request.index}",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return ForgejoIssue(**response.json())
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_issue_comments(request: ListIssueCommentsRequest) -> list[ForgejoIssueComment]:
    """List comments for a specific issue.

    Returns:
        list[ForgejoIssueComment]: A list of issue comments.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/issues/{request.index}/comments",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoIssueComment(**comment) for comment in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_repository_pull_requests(
    request: ListPullRequestsRequest,
) -> list[ForgejoPullRequest]:
    """List pull requests for a repository.

    Returns:
        list[ForgejoPullRequest]: A list of pull requests.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"state": request.state} if request.state else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/pulls",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                params=params,
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoPullRequest(**pr) for pr in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_specific_pull_request(request: GetPullRequestRequest) -> ForgejoPullRequest:
    """Get a specific pull request from a repository.

    Returns:
        ForgejoPullRequest: The pull request details.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/pulls/{request.index}",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return ForgejoPullRequest(**response.json())
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_pull_request_comments(
    request: ListPullRequestCommentsRequest,
) -> list[ForgejoPullRequestComment]:
    """List comments for a specific pull request review.

    Returns:
        list[ForgejoPullRequestComment]: A list of pull request comments.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/pulls/{request.index}/reviews",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoPullRequestComment(**comment) for comment in response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def search_repositories(request: SearchReposRequest) -> list[ForgejoRepository]:
    """Search for repositories.

    Returns:
        list[ForgejoRepository]: A list of matching repositories.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"q": request.q} if request.q else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/search",
                params=params,
                timeout=10.0,
            )
            response.raise_for_status()
            # The search endpoint wraps its results in a "data" key.
            return [ForgejoRepository(**repo) for repo in response.json().get("data", [])]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e


async def get_raw_file_content(request: GetRawFileRequest) -> Response:
    """Get the raw content of a file from a repository.

    Returns:
        Response: The raw content of the file.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{request.owner}/{request.repo}/raw/{request.filepath}",
                headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
                timeout=10.0,
            )
            response.raise_for_status()
            return Response(content=response.content, media_type=response.headers["Content-Type"])
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
@@ -10,10 +10,8 @@ timeout configurations across all endpoints.

from __future__ import annotations

import httpx
from fastapi import APIRouter, HTTPException, Response
from fastapi import APIRouter, Response

from openapi_mcp_server.core.config import get_forgejo_config
from openapi_mcp_server.tools.base import BaseTool
from openapi_mcp_server.tools.forgejo.models import (
    ForgejoBranch,
@@ -29,502 +27,267 @@ from openapi_mcp_server.tools.forgejo.models import (

    ForgejoVersion,
    ForgejoWorkflowJob,
    ForgejoWorkflowRun,
    GetIssueRequest,
    GetJobLogRequest,
    GetPullRequestRequest,
    GetRawFileRequest,
    GetVersionRequest,
    ListBranchesRequest,
    ListCommitsRequest,
    ListFilesRequest,
    ListInstancesRequest,
    ListIssueCommentsRequest,
    ListIssuesRequest,
    ListPullRequestCommentsRequest,
    ListPullRequestsRequest,
    ListWorkflowJobsRequest,
    ListWorkflowRunsRequest,
    SearchReposRequest,
)

from .responses import (
    get_directory_files,
    get_forgejo_instances,
    get_forgejo_version,
    get_issue_comments,
    get_job_log_content,
    get_pull_request_comments,
    get_raw_file_content,
    get_repository_branches,
    get_repository_commits,
    get_repository_issues,
    get_repository_pull_requests,
    get_specific_issue,
    get_specific_pull_request,
    get_workflow_jobs,
    get_workflow_runs,
    search_repositories,
)

router = APIRouter()


@router.get("/instances", response_model=list[ForgejoInstance])
async def list_forgejo_instances() -> list[ForgejoInstance]:
@router.post(
    "/instances",
    response_model=list[ForgejoInstance],
    summary="List available Forgejo instances",
)
async def post_instances(
    request: ListInstancesRequest | None = None,
) -> list[ForgejoInstance]:
    """List the Forgejo instances for which this application has authentication tokens.

    Returns:
        list[ForgejoInstance]: A list of Forgejo instances with configured tokens.
    """
    return [
        ForgejoInstance(base_url=url) for url in get_forgejo_config().get_configured_instance_urls()
    ]
    return await get_forgejo_instances(request)


@router.get("/")
async def read_root() -> dict:
    """Root endpoint for the Forgejo tool.

    Returns:
        dict: A welcome message.
    """
    return {"message": "Forgejo tool is working!"}


@router.get("/version", response_model=ForgejoVersion)
async def get_version() -> ForgejoVersion:
@router.post(
    "/version",
    response_model=ForgejoVersion,
    summary="Get Forgejo instance version info",
)
async def post_version(request: GetVersionRequest | None = None) -> ForgejoVersion:
    """Get the version of the Forgejo instance.

    Returns:
        ForgejoVersion: The version information.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/version", timeout=10.0
            )
            response.raise_for_status()  # Raise an exception for HTTP errors (4xx or 5xx)
            return ForgejoVersion(**(await response.json()))
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
    return await get_forgejo_version(request)


@router.get("/repos/{owner}/{repo}/branches", response_model=list[ForgejoBranch])
async def list_branches(owner: str, repo: str) -> list[ForgejoBranch]:
@router.post(
    "/branches",
    response_model=list[ForgejoBranch],
    summary="List repository branches with commit info",
)
async def post_branches(request: ListBranchesRequest) -> list[ForgejoBranch]:
    """List all branches for a repository.

    Returns:
        list[ForgejoBranch]: A list of branches.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/branches",
                timeout=10.0,
            )
            response.raise_for_status()
            return [ForgejoBranch(**branch) for branch in await response.json()]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
    return await get_repository_branches(request)


@router.get("/repos/{owner}/{repo}/contents/{filepath:path}", response_model=list[ForgejoFile])
async def list_files_in_directory(
    owner: str, repo: str, filepath: str, ref: str | None = None
) -> list[ForgejoFile]:
@router.post(
    "/files",
    response_model=list[ForgejoFile],
    summary="Browse repository files and folders",
)
async def post_files(request: ListFilesRequest) -> list[ForgejoFile]:
    """List the contents of a directory or get a file's content.

    Returns:
        list[ForgejoFile]: A list of file/directory contents.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"ref": ref} if ref else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/contents/{filepath}",
                params=params,
                timeout=10.0,
            )
            response.raise_for_status()
            # The API returns a list if it's a directory, or a single dict if it's a file
            json_response = await response.json()
            if isinstance(json_response, list):
                return [ForgejoFile(**item) for item in json_response]
            return [ForgejoFile(**json_response)]
    except httpx.RequestError as e:
        raise HTTPException(
            status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
        ) from e
    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"Forgejo API returned an error: {e.response.text}",
        ) from e
    return await get_directory_files(request)


@router.get("/repos/{owner}/{repo}/commits", response_model=list[ForgejoCommit])
async def list_commits(owner: str, repo: str, sha: str | None = None) -> list[ForgejoCommit]:
@router.post(
    "/commits",
    response_model=list[ForgejoCommit],
    summary="List commits with author and message info",
)
async def post_commits(request: ListCommitsRequest) -> list[ForgejoCommit]:
    """List commits for a repository.

    Returns:
        list[ForgejoCommit]: A list of commits.

    Raises:
        HTTPException: If the request to the Forgejo instance fails.
    """
    try:
        async with httpx.AsyncClient() as client:
            params = {"sha": sha} if sha else {}
            response = await client.get(
                f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/commits",
                params=params,
                timeout=10.0,
            )
|
||||
response.raise_for_status()
|
||||
return [ForgejoCommit(**commit) for commit in await response.json()]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_repository_commits(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/actions/tasks", response_model=list[ForgejoWorkflowRun])
|
||||
async def list_workflow_runs(owner: str, repo: str) -> list[ForgejoWorkflowRun]:
|
||||
@router.post(
|
||||
"/workflow-runs",
|
||||
response_model=list[ForgejoWorkflowRun],
|
||||
summary="List CI/CD workflow runs and their status",
|
||||
)
|
||||
async def post_workflow_runs(request: ListWorkflowRunsRequest) -> list[ForgejoWorkflowRun]:
|
||||
"""List workflow runs (tasks) for a repository.
|
||||
|
||||
Returns:
|
||||
list[ForgejoWorkflowRun]: A list of workflow runs.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/actions/tasks",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return [ForgejoWorkflowRun(**run) for run in response.json()]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_workflow_runs(request)
|
||||
|
||||
|
||||
@router.get(
|
||||
"/repos/{owner}/{repo}/actions/runs/{run_id}/jobs", response_model=list[ForgejoWorkflowJob]
|
||||
@router.post(
|
||||
"/workflow-jobs",
|
||||
response_model=list[ForgejoWorkflowJob],
|
||||
summary="List jobs within a workflow run with status",
|
||||
)
|
||||
async def list_workflow_jobs(owner: str, repo: str, run_id: int) -> list[ForgejoWorkflowJob]:
|
||||
async def post_workflow_jobs(request: ListWorkflowJobsRequest) -> list[ForgejoWorkflowJob]:
|
||||
"""List workflow jobs for a specific workflow run.
|
||||
|
||||
Returns:
|
||||
list[ForgejoWorkflowJob]: A list of workflow jobs.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/actions/runs/{run_id}/jobs",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
# The API returns a dictionary with a "jobs" key
|
||||
return [ForgejoWorkflowJob(**job) for job in (await response.json())["jobs"]]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_workflow_jobs(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/actions/jobs/{job_id}/log", response_model=ForgejoJobLog)
|
||||
async def get_job_log(owner: str, repo: str, job_id: int) -> ForgejoJobLog:
|
||||
@router.post(
|
||||
"/job-log",
|
||||
response_model=ForgejoJobLog,
|
||||
summary="Get CI/CD job logs for debugging",
|
||||
)
|
||||
async def post_job_log(request: GetJobLogRequest) -> ForgejoJobLog:
|
||||
"""Get the log for a specific workflow job.
|
||||
|
||||
Returns:
|
||||
ForgejoJobLog: The log content of the job.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/actions/jobs/{job_id}/log",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return ForgejoJobLog(log_content=response.text)
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_job_log_content(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/issues", response_model=list[ForgejoIssue])
|
||||
async def list_issues(owner: str, repo: str, state: str | None = None) -> list[ForgejoIssue]:
|
||||
@router.post(
|
||||
"/issues",
|
||||
response_model=list[ForgejoIssue],
|
||||
summary="List issues with title, state, and labels",
|
||||
)
|
||||
async def post_issues(request: ListIssuesRequest) -> list[ForgejoIssue]:
|
||||
"""List issues for a repository.
|
||||
|
||||
Returns:
|
||||
list[ForgejoIssue]: A list of issues.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
params = {"state": state} if state else {}
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/issues",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
params=params,
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return [ForgejoIssue(**issue) for issue in await response.json()]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_repository_issues(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/issues/{index}", response_model=ForgejoIssue)
|
||||
async def get_issue(owner: str, repo: str, index: int) -> ForgejoIssue:
|
||||
@router.post("/issue", response_model=ForgejoIssue, summary="Get specific issue details by number")
|
||||
async def post_issue(request: GetIssueRequest) -> ForgejoIssue:
|
||||
"""Get a specific issue from a repository.
|
||||
|
||||
Returns:
|
||||
ForgejoIssue: The issue details.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/issues/{index}",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return ForgejoIssue(**(await response.json()))
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_specific_issue(request)
|
||||
|
||||
|
||||
@router.get(
|
||||
"/repos/{owner}/{repo}/issues/{index}/comments", response_model=list[ForgejoIssueComment]
|
||||
@router.post(
|
||||
"/issue-comments",
|
||||
response_model=list[ForgejoIssueComment],
|
||||
summary="List issue comments with author and timestamp",
|
||||
)
|
||||
async def list_issue_comments(owner: str, repo: str, index: int) -> list[ForgejoIssueComment]:
|
||||
async def post_issue_comments(request: ListIssueCommentsRequest) -> list[ForgejoIssueComment]:
|
||||
"""List comments for a specific issue.
|
||||
|
||||
Returns:
|
||||
list[ForgejoIssueComment]: A list of issue comments.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/issues/{index}/comments",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return [ForgejoIssueComment(**comment) for comment in await response.json()]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_issue_comments(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/pulls", response_model=list[ForgejoPullRequest])
|
||||
async def list_pull_requests(
|
||||
owner: str, repo: str, state: str | None = None
|
||||
) -> list[ForgejoPullRequest]:
|
||||
@router.post(
|
||||
"/pull-requests",
|
||||
response_model=list[ForgejoPullRequest],
|
||||
summary="List pull requests with title and state",
|
||||
)
|
||||
async def post_pull_requests(request: ListPullRequestsRequest) -> list[ForgejoPullRequest]:
|
||||
"""List pull requests for a repository.
|
||||
|
||||
Returns:
|
||||
list[ForgejoPullRequest]: A list of pull requests.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
params = {"state": state} if state else {}
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/pulls",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
params=params,
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return [ForgejoPullRequest(**pr) for pr in await response.json()]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_repository_pull_requests(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/pulls/{index}", response_model=ForgejoPullRequest)
|
||||
async def get_pull_request(owner: str, repo: str, index: int) -> ForgejoPullRequest:
|
||||
@router.post(
|
||||
"/pull-request",
|
||||
response_model=ForgejoPullRequest,
|
||||
summary="Get specific pull request details by number",
|
||||
)
|
||||
async def post_pull_request(request: GetPullRequestRequest) -> ForgejoPullRequest:
|
||||
"""Get a specific pull request from a repository.
|
||||
|
||||
Returns:
|
||||
ForgejoPullRequest: The pull request details.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/pulls/{index}",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return ForgejoPullRequest(**(await response.json()))
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_specific_pull_request(request)
|
||||
|
||||
|
||||
@router.get(
|
||||
"/repos/{owner}/{repo}/pulls/{index}/reviews", response_model=list[ForgejoPullRequestComment]
|
||||
@router.post(
|
||||
"/pull-request-comments",
|
||||
response_model=list[ForgejoPullRequestComment],
|
||||
summary="List pull request review comments",
|
||||
)
|
||||
async def list_pull_request_comments(
|
||||
owner: str, repo: str, index: int
|
||||
async def post_pull_request_comments(
|
||||
request: ListPullRequestCommentsRequest,
|
||||
) -> list[ForgejoPullRequestComment]:
|
||||
"""List comments for a specific pull request review.
|
||||
|
||||
Returns:
|
||||
list[ForgejoPullRequestComment]: A list of pull request comments.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/pulls/{index}/reviews",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return [ForgejoPullRequestComment(**comment) for comment in await response.json()]
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_pull_request_comments(request)
|
||||
|
||||
|
||||
@router.get("/repos/search", response_model=list[ForgejoRepository])
|
||||
async def search_repos(q: str | None = None) -> list[ForgejoRepository]:
|
||||
@router.post(
|
||||
"/search-repos",
|
||||
response_model=list[ForgejoRepository],
|
||||
summary="Search repositories by name or description",
|
||||
)
|
||||
async def post_search_repos(request: SearchReposRequest) -> list[ForgejoRepository]:
|
||||
"""Search for repositories.
|
||||
|
||||
Returns:
|
||||
list[ForgejoRepository]: A list of matching repositories.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
params = {"q": q} if q else {}
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/search",
|
||||
params=params,
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await search_repositories(request)
|
||||
|
||||
|
||||
@router.get("/repos/{owner}/{repo}/raw/{filepath:path}")
|
||||
async def get_raw_file(owner: str, repo: str, filepath: str) -> Response:
|
||||
@router.post(
|
||||
"/raw-file",
|
||||
summary="Get raw file content from a repository at a specific path",
|
||||
)
|
||||
async def post_raw_file(request: GetRawFileRequest) -> Response:
|
||||
"""Get the raw content of a file from a repository.
|
||||
|
||||
Returns:
|
||||
Response: The raw content of the file.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request to the Forgejo instance fails.
|
||||
"""
|
||||
try:
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
f"{get_forgejo_config().base_url}/api/v1/repos/{owner}/{repo}/raw/{filepath}",
|
||||
headers=get_forgejo_config().get_headers(get_forgejo_config().base_url),
|
||||
timeout=10.0,
|
||||
)
|
||||
response.raise_for_status()
|
||||
return Response(content=response.content, media_type=response.headers["Content-Type"])
|
||||
except httpx.RequestError as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Failed to connect to Forgejo instance: {e}"
|
||||
) from e
|
||||
except httpx.HTTPStatusError as e:
|
||||
raise HTTPException(
|
||||
status_code=e.response.status_code,
|
||||
detail=f"Forgejo API returned an error: {e.response.text}",
|
||||
) from e
|
||||
return await get_raw_file_content(request)
|
||||
|
||||
|
||||
class ForgejoTool(BaseTool):
|
||||
|
|
|
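For reference, clients of the reworked Forgejo endpoints now send everything in the request body. The sketch below is illustrative only: the `http://localhost:8000/forgejo` base URL, the mount prefix, and the `owner`/`repo` field names of `ListBranchesRequest` are assumptions inferred from the replaced path parameters, not taken from this commit.

```python
# Hypothetical smoke test for the GET -> POST endpoint migration above.
# Base URL, mount prefix, and request-body field names are assumptions.
import asyncio

import httpx

BASE = "http://localhost:8000/forgejo"  # assumed mount point


async def main() -> None:
    async with httpx.AsyncClient() as client:
        # Endpoints with an optional request model accept an empty JSON body.
        instances = await client.post(f"{BASE}/instances", json={})
        print(instances.json())

        # Path parameters such as {owner}/{repo} move into the JSON body.
        branches = await client.post(
            f"{BASE}/branches",
            json={"owner": "example-owner", "repo": "example-repo"},
        )
        print(branches.json())


asyncio.run(main())
```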
@@ -10,6 +10,77 @@ from __future__ import annotations

from pydantic import BaseModel, Field


class CreateMemoryRequest(BaseModel):
    """Represents a request to create a new memory.

    Attributes:
        content: The detailed information to be stored in the memory.
        entities: A list of entity names to associate with the memory.
    """

    content: str = Field(
        ...,
        description=(
            "The detailed fact or information to store. Include comprehensive details, "
            "context, and specifics for best recall - avoid brief summaries."
        ),
    )
    entities: list[str] = Field(..., description="List of entity names this memory references")


class DeleteMemoryRequest(BaseModel):
    """Represents a request to delete specific memories.

    Attributes:
        memory_ids: A list of memory IDs to be deleted.
    """

    memory_ids: list[str] = Field(..., description="List of memory IDs to delete")


class Entity(BaseModel):
    """Represents an entity that can be associated with memories.

    Attributes:
        name: The name of the entity.
        entity_type: The type of the entity (e.g. 'person', 'place').
        memory_count: The number of memories that reference this entity.
    """

    name: str = Field(..., description="The name of the entity")
    entity_type: str = Field(default="general", description="The type of the entity")
    memory_count: int = Field(default=0, description="Number of memories referencing this entity")


class GetAllMemoriesRequest(BaseModel):
    """Represents a request to retrieve all stored memories.

    Attributes:
        limit: The maximum number of memories to return.
    """

    limit: int = Field(default=20, description="Maximum number of memories to return")


class GetEntityRequest(BaseModel):
    """Represents a request to retrieve memories for specific entities.

    Attributes:
        entities: A list of entity names to retrieve memories for.
        limit: The maximum number of memories to return per entity.
    """

    entities: list[str] = Field(..., description="List of entity names to retrieve memories for")
    limit: int = Field(default=5, description="Maximum number of memories to return")


class GetMemorySummaryRequest(BaseModel):
    """Represents a request to get memory summary statistics.

    This is an empty request model for consistency with POST endpoints.
    """


class Memory(BaseModel):
    """Represents a single, timestamped memory associated with one or more entities.

@@ -31,20 +102,6 @@ class Memory(BaseModel):
    timestamp: str = Field(..., description="ISO timestamp when this memory was created")


class Entity(BaseModel):
    """Represents an entity that can be associated with memories.

    Attributes:
        name: The name of the entity.
        entity_type: The type of the entity (e.g., 'person', 'place').
        memory_count: The number of memories that reference this entity.
    """

    name: str = Field(..., description="The name of the entity")
    entity_type: str = Field(default="general", description="The type of the entity")
    memory_count: int = Field(default=0, description="Number of memories referencing this entity")


class MemoryGraph(BaseModel):
    """Represents a collection of memories and entities.

@@ -57,58 +114,6 @@ class MemoryGraph(BaseModel):
    entities: list[Entity]


class CreateMemoryRequest(BaseModel):
    """Represents a request to create a new memory.

    Attributes:
        content: The detailed information to be stored in the memory.
        entities: A list of entity names to associate with the memory.
    """

    content: str = Field(
        ...,
        description=(
            "The detailed fact or information to store. Include comprehensive details, "
            "context, and specifics for best recall - avoid brief summaries."
        ),
    )
    entities: list[str] = Field(..., description="List of entity names this memory references")


class SearchMemoryRequest(BaseModel):
    """Represents a request to search for memories.

    Attributes:
        query: The search term to find within memory content or entity names.
        limit: The maximum number of memories to return.
    """

    query: str = Field(..., description="Search term to find in memory content or entity names")
    limit: int = Field(default=10, description="Maximum number of memories to return")


class GetEntityRequest(BaseModel):
    """Represents a request to retrieve memories for specific entities.

    Attributes:
        entities: A list of entity names to retrieve memories for.
        limit: The maximum number of memories to return per entity.
    """

    entities: list[str] = Field(..., description="List of entity names to retrieve memories for")
    limit: int = Field(default=5, description="Maximum number of memories to return")


class DeleteMemoryRequest(BaseModel):
    """Represents a request to delete specific memories.

    Attributes:
        memory_ids: A list of memory IDs to be deleted.
    """

    memory_ids: list[str] = Field(..., description="List of memory IDs to delete")


class MemorySummary(BaseModel):
    """Represents summary statistics about the stored memories.

@@ -129,3 +134,15 @@ class MemorySummary(BaseModel):
        ..., description="Days between oldest and latest memory"
    )
    top_entities: list[Entity] = Field(..., description="Most frequently referenced entities")


class SearchMemoryRequest(BaseModel):
    """Represents a request to search for memories.

    Attributes:
        query: The search term to find within memory content or entity names.
        limit: The maximum number of memories to return.
    """

    query: str = Field(..., description="Search term to find in memory content or entity names")
    limit: int = Field(default=10, description="Maximum number of memories to return")
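Since the hunks above only reorder the model definitions alphabetically and add the new empty request models, existing call sites keep validating as before. A minimal sketch, assuming Pydantic v2 (`model_dump`):

```python
# Illustrative use of the reordered request models; behaviour is unchanged.
from openapi_mcp_server.tools.memory.models import (
    CreateMemoryRequest,
    GetMemorySummaryRequest,
    SearchMemoryRequest,
)

req = CreateMemoryRequest(
    content="Alice prefers tea over coffee and works from Berlin on Fridays.",
    entities=["Alice"],
)
print(req.model_dump())

search = SearchMemoryRequest(query="alice")  # limit defaults to 10
print(search.limit)

stats_req = GetMemorySummaryRequest()  # intentionally empty request model
```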
258 openapi_mcp_server/tools/memory/responses.py Normal file

@@ -0,0 +1,258 @@
"""This module contains response generation logic for the Memory tool.

It handles the core functionality of creating, retrieving, searching, and deleting
memories, as well as generating summary statistics for the memory system.
"""

from __future__ import annotations

from datetime import datetime

from .models import (
    CreateMemoryRequest,
    DeleteMemoryRequest,
    Entity,
    GetAllMemoriesRequest,
    GetEntityRequest,
    GetMemorySummaryRequest,
    Memory,
    MemoryGraph,
    MemorySummary,
    SearchMemoryRequest,
)
from .storage import (
    generate_memory_id,
    get_current_timestamp,
    read_memory_graph,
    save_memory_graph,
)


def create_memory(req: CreateMemoryRequest) -> Memory:
    """Stores a new memory or fact in the memory graph.

    This function creates a new memory with a unique ID and timestamp. It also
    updates the associated entities, creating them if they do not already
    exist, and increments their memory counts.

    Args:
        req: A CreateMemoryRequest object containing the content of the memory
            and the entities it is associated with.

    Returns:
        The newly created Memory object.
    """
    graph = read_memory_graph()

    # Create new memory with auto-generated timestamp and ID
    memory = Memory(
        id=generate_memory_id(),
        content=req.content,
        entities=req.entities,
        timestamp=get_current_timestamp(),
    )

    graph.memories.append(memory)

    # Update entity counts and ensure entities exist
    entity_dict = {e.name: e for e in graph.entities}
    for entity_name in req.entities:
        if entity_name in entity_dict:
            entity_dict[entity_name].memory_count += 1
        else:
            # Create new entity
            new_entity = Entity(name=entity_name, entity_type="general", memory_count=1)
            graph.entities.append(new_entity)
            entity_dict[entity_name] = new_entity

    save_memory_graph(graph)
    return memory


def get_all_memories(req: GetAllMemoriesRequest) -> MemoryGraph:
    """Retrieves all stored memories and entities.

    The memories are sorted by timestamp in descending order (newest first).

    Args:
        req: A GetAllMemoriesRequest object containing the limit.

    Returns:
        A MemoryGraph object containing all stored memories and entities.
    """
    graph = read_memory_graph()
    # Sort memories by timestamp (newest first)
    graph.memories.sort(key=lambda m: m.timestamp, reverse=True)
    if req.limit and len(graph.memories) > req.limit:
        graph.memories = graph.memories[: req.limit]
    return graph


def search_memories(req: SearchMemoryRequest) -> MemoryGraph:
    """Searches for memories by their content or associated entity names.

    The search is case-insensitive. The results are sorted by timestamp in
    descending order (newest first).

    Args:
        req: A SearchMemoryRequest object containing the search query and limit.

    Returns:
        A MemoryGraph object containing the memories and entities that match
        the search query.
    """
    graph = read_memory_graph()
    query = req.query.lower()

    matching_memories = []
    for memory in graph.memories:
        # Search in content
        if query in memory.content.lower():
            matching_memories.append(memory)
            continue
        # Search in entity names
        if any(query in entity.lower() for entity in memory.entities):
            matching_memories.append(memory)

    # Sort by timestamp (newest first) and apply limit
    matching_memories.sort(key=lambda m: m.timestamp, reverse=True)
    if len(matching_memories) > req.limit:
        matching_memories = matching_memories[: req.limit]

    # Get entities referenced in matching memories
    referenced_entities = set()
    for memory in matching_memories:
        referenced_entities.update(memory.entities)

    matching_entities = [e for e in graph.entities if e.name in referenced_entities]

    return MemoryGraph(memories=matching_memories, entities=matching_entities)


def get_entity_memories(req: GetEntityRequest) -> MemoryGraph:
    """Retrieves all memories associated with specific entities.

    The results are sorted by timestamp in descending order (newest first).

    Args:
        req: A GetEntityRequest object containing the list of entity names.

    Returns:
        A MemoryGraph object containing the memories and entities associated
        with the specified entity names.
    """
    graph = read_memory_graph()

    # Check if memory references any of the requested entities
    matching_memories = [
        memory
        for memory in graph.memories
        if any(entity in memory.entities for entity in req.entities)
    ]

    # Sort by timestamp (newest first) and apply limit
    matching_memories.sort(key=lambda m: m.timestamp, reverse=True)
    if len(matching_memories) > req.limit:
        matching_memories = matching_memories[: req.limit]

    # Get the requested entities
    matching_entities = [e for e in graph.entities if e.name in req.entities]

    return MemoryGraph(memories=matching_memories, entities=matching_entities)


def delete_memories(req: DeleteMemoryRequest) -> dict[str, str]:
    """Deletes specific memories from the memory graph by their IDs.

    After deleting the memories, this function recalculates the memory counts
    for all entities and removes any entities that are no longer referenced
    by any memories.

    Args:
        req: A DeleteMemoryRequest object containing the list of memory IDs
            to be deleted.

    Returns:
        A dictionary with a message indicating how many memories were deleted.
    """
    graph = read_memory_graph()
    original_count = len(graph.memories)

    # Remove memories and track which entities were affected
    affected_entities = set()
    memories_to_keep = []
    for m in graph.memories:
        if m.id not in req.memory_ids:
            memories_to_keep.append(m)
        else:
            affected_entities.update(m.entities)
    graph.memories = memories_to_keep

    deleted_count = original_count - len(graph.memories)

    # Recalculate entity memory counts
    entity_counts: dict[str, int] = {}
    for memory in graph.memories:
        for entity_name in memory.entities:
            entity_counts[entity_name] = entity_counts.get(entity_name, 0) + 1

    # Update entity counts and remove entities with zero memories
    graph.entities = [
        Entity(name=e.name, entity_type=e.entity_type, memory_count=entity_counts.get(e.name, 0))
        for e in graph.entities
        if entity_counts.get(e.name, 0) > 0
    ]

    save_memory_graph(graph)
    return {"message": f"Deleted {deleted_count} memories"}


def get_summary(req: GetMemorySummaryRequest) -> MemorySummary:  # noqa: ARG001
    """Retrieves a summary of the memory graph statistics.

    This includes the total number of memories and entities, the timestamps of
    the oldest and latest memories, the timespan in days, and a list of the
    most frequently referenced entities.

    Args:
        req: A GetMemorySummaryRequest object (empty, for consistency).

    Returns:
        A MemorySummary object containing the statistics.
    """
    graph = read_memory_graph()

    if not graph.memories:
        return MemorySummary(
            total_memories=0,
            total_entities=0,
            oldest_memory=None,
            latest_memory=None,
            memory_timespan_days=None,
            top_entities=[],
        )

    # Sort memories by timestamp
    sorted_memories = sorted(graph.memories, key=lambda m: m.timestamp)
    oldest = sorted_memories[0].timestamp
    latest = sorted_memories[-1].timestamp

    # Calculate timespan
    try:
        oldest_dt = datetime.fromisoformat(oldest)
        latest_dt = datetime.fromisoformat(latest)
        timespan_days = (latest_dt - oldest_dt).days
    except ValueError:
        timespan_days = None

    # Get top entities by memory count
    top_entities = sorted(graph.entities, key=lambda e: e.memory_count, reverse=True)[:10]

    return MemorySummary(
        total_memories=len(graph.memories),
        total_entities=len(graph.entities),
        oldest_memory=oldest,
        latest_memory=latest,
        memory_timespan_days=timespan_days,
        top_entities=top_entities,
    )
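The new module keeps the memory logic callable without FastAPI in the loop. A rough usage sketch, assuming the `.storage` backend is writable in the local environment:

```python
# Driving the extracted response-layer functions directly (no HTTP involved).
from openapi_mcp_server.tools.memory.models import (
    CreateMemoryRequest,
    GetAllMemoriesRequest,
    SearchMemoryRequest,
)
from openapi_mcp_server.tools.memory.responses import (
    create_memory,
    get_all_memories,
    search_memories,
)

memory = create_memory(
    CreateMemoryRequest(content="Bob's birthday is 14 March.", entities=["Bob"])
)
print(memory.id, memory.timestamp)

graph = search_memories(SearchMemoryRequest(query="birthday", limit=5))
print([m.content for m in graph.memories])

everything = get_all_memories(GetAllMemoriesRequest())
print(len(everything.memories), len(everything.entities))
```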
@@ -8,8 +8,6 @@ facts associated with various entities.

from __future__ import annotations

from datetime import datetime

from fastapi import APIRouter

from openapi_mcp_server.tools.base import BaseTool

@@ -17,18 +15,21 @@ from openapi_mcp_server.tools.base import BaseTool
from .models import (
    CreateMemoryRequest,
    DeleteMemoryRequest,
    Entity,
    GetAllMemoriesRequest,
    GetEntityRequest,
    GetMemorySummaryRequest,
    Memory,
    MemoryGraph,
    MemorySummary,
    SearchMemoryRequest,
)
from .storage import (
    generate_memory_id,
    get_current_timestamp,
    read_memory_graph,
    save_memory_graph,
from .responses import (
    create_memory,
    delete_memories,
    get_all_memories,
    get_entity_memories,
    get_summary,
    search_memories,
)


@@ -50,233 +51,6 @@ class MemoryTool(BaseTool):
            description="A simple memory system for storing timestamped facts about entities",
        )

    @staticmethod
    def create_memory(req: CreateMemoryRequest) -> Memory:
        """Stores a new memory or fact in the memory graph.

        This method creates a new memory with a unique ID and timestamp. It also
        updates the associated entities, creating them if they do not already
        exist, and increments their memory counts.

        Args:
            req: A CreateMemoryRequest object containing the content of the memory
                and the entities it is associated with.

        Returns:
            The newly created Memory object.
        """
        graph = read_memory_graph()

        # Create new memory with auto-generated timestamp and ID
        memory = Memory(
            id=generate_memory_id(),
            content=req.content,
            entities=req.entities,
            timestamp=get_current_timestamp(),
        )

        graph.memories.append(memory)

        # Update entity counts and ensure entities exist
        entity_dict = {e.name: e for e in graph.entities}
        for entity_name in req.entities:
            if entity_name in entity_dict:
                entity_dict[entity_name].memory_count += 1
            else:
                # Create new entity
                new_entity = Entity(name=entity_name, entity_type="general", memory_count=1)
                graph.entities.append(new_entity)
                entity_dict[entity_name] = new_entity

        save_memory_graph(graph)
        return memory

    @staticmethod
    def get_all_memories(limit: int = 20) -> MemoryGraph:
        """Retrieves all stored memories and entities.

        The memories are sorted by timestamp in descending order (newest first).

        Args:
            limit: The maximum number of memories to return.

        Returns:
            A MemoryGraph object containing all stored memories and entities.
        """
        graph = read_memory_graph()
        # Sort memories by timestamp (newest first)
        graph.memories.sort(key=lambda m: m.timestamp, reverse=True)
        if limit and len(graph.memories) > limit:
            graph.memories = graph.memories[:limit]
        return graph

    @staticmethod
    def search_memories(req: SearchMemoryRequest) -> MemoryGraph:
        """Searches for memories by their content or associated entity names.

        The search is case-insensitive. The results are sorted by timestamp in
        descending order (newest first).

        Args:
            req: A SearchMemoryRequest object containing the search query and limit.

        Returns:
            A MemoryGraph object containing the memories and entities that match
            the search query.
        """
        graph = read_memory_graph()
        query = req.query.lower()

        matching_memories = []
        for memory in graph.memories:
            # Search in content
            if query in memory.content.lower():
                matching_memories.append(memory)
                continue
            # Search in entity names
            if any(query in entity.lower() for entity in memory.entities):
                matching_memories.append(memory)

        # Sort by timestamp (newest first) and apply limit
        matching_memories.sort(key=lambda m: m.timestamp, reverse=True)
        if len(matching_memories) > req.limit:
            matching_memories = matching_memories[: req.limit]

        # Get entities referenced in matching memories
        referenced_entities = set()
        for memory in matching_memories:
            referenced_entities.update(memory.entities)

        matching_entities = [e for e in graph.entities if e.name in referenced_entities]

        return MemoryGraph(memories=matching_memories, entities=matching_entities)

    @staticmethod
    def get_entity_memories(req: GetEntityRequest) -> MemoryGraph:
        """Retrieves all memories associated with specific entities.

        The results are sorted by timestamp in descending order (newest first).

        Args:
            req: A GetEntityRequest object containing the list of entity names.

        Returns:
            A MemoryGraph object containing the memories and entities associated
            with the specified entity names.
        """
        graph = read_memory_graph()

        # Check if memory references any of the requested entities
        matching_memories = [
            memory
            for memory in graph.memories
            if any(entity in memory.entities for entity in req.entities)
        ]

        # Sort by timestamp (newest first) and apply limit
        matching_memories.sort(key=lambda m: m.timestamp, reverse=True)
        if len(matching_memories) > req.limit:
            matching_memories = matching_memories[: req.limit]

        # Get the requested entities
        matching_entities = [e for e in graph.entities if e.name in req.entities]

        return MemoryGraph(memories=matching_memories, entities=matching_entities)

    @staticmethod
    def delete_memories(req: DeleteMemoryRequest) -> dict[str, str]:
        """Deletes specific memories from the memory graph by their IDs.

        After deleting the memories, this method recalculates the memory counts
        for all entities and removes any entities that are no longer referenced
        by any memories.

        Args:
            req: A DeleteMemoryRequest object containing the list of memory IDs
                to be deleted.

        Returns:
            A dictionary with a message indicating how many memories were deleted.
        """
        graph = read_memory_graph()
        original_count = len(graph.memories)

        # Remove memories and track which entities were affected
        affected_entities = set()
        graph.memories = [
            m
            for m in graph.memories
            if m.id not in req.memory_ids or affected_entities.update(m.entities)
        ]

        deleted_count = original_count - len(graph.memories)

        # Recalculate entity memory counts
        entity_counts = {}
        for memory in graph.memories:
            for entity_name in memory.entities:
                entity_counts[entity_name] = entity_counts.get(entity_name, 0) + 1

        # Update entity counts and remove entities with zero memories
        graph.entities = [
            Entity(
                name=e.name, entity_type=e.entity_type, memory_count=entity_counts.get(e.name, 0)
            )
            for e in graph.entities
            if entity_counts.get(e.name, 0) > 0
        ]

        save_memory_graph(graph)
        return {"message": f"Deleted {deleted_count} memories"}

    @staticmethod
    def get_summary() -> MemorySummary:
        """Retrieves a summary of the memory graph statistics.

        This includes the total number of memories and entities, the timestamps of
        the oldest and latest memories, the timespan in days, and a list of the
        most frequently referenced entities.

        Returns:
            A MemorySummary object containing the statistics.
        """
        graph = read_memory_graph()

        if not graph.memories:
            return MemorySummary(
                total_memories=0,
                total_entities=0,
                oldest_memory=None,
                latest_memory=None,
                memory_timespan_days=None,
                top_entities=[],
            )

        # Sort memories by timestamp
        sorted_memories = sorted(graph.memories, key=lambda m: m.timestamp)
        oldest = sorted_memories[0].timestamp
        latest = sorted_memories[-1].timestamp

        # Calculate timespan
        try:
            oldest_dt = datetime.fromisoformat(oldest)
            latest_dt = datetime.fromisoformat(latest)
            timespan_days = (latest_dt - oldest_dt).days
        except ValueError:
            timespan_days = None

        # Get top entities by memory count
        top_entities = sorted(graph.entities, key=lambda e: e.memory_count, reverse=True)[:10]

        return MemorySummary(
            total_memories=len(graph.memories),
            total_entities=len(graph.entities),
            oldest_memory=oldest,
            latest_memory=latest,
            memory_timespan_days=timespan_days,
            top_entities=top_entities,
        )

    def get_router(self) -> APIRouter:
        """Creates and returns the FastAPI router for the memory tool.

@@ -287,65 +61,104 @@ class MemoryTool(BaseTool):
        """
        router = APIRouter()

        router.add_api_route(
        @router.post(
            "/create",
            MemoryTool.create_memory,
            methods=["POST"],
            response_model=Memory,
            summary=(
                "Create a new memory. Before creating, check existing memories to avoid duplicates "
                "and to know what best to add. When you have information to store on multiple "
                "entities, favour creating multiple, detailed memories instead of one large, "
                "monolithic memory."
                "Store facts about people, places, topics, or things. Search existing memories "
                "first to avoid duplicates. Create separate memories for different facts"
            ),
        )
        def post_create(request: CreateMemoryRequest) -> Memory:
            """Store a new fact about entities.

        router.add_api_route(
            Returns:
                Memory: The created memory.
            """
            return create_memory(request)

        @router.post(
            "/all",
            MemoryTool.get_all_memories,
            methods=["GET"],
            response_model=MemoryGraph,
            summary=("Retrieve all stored memories and entities (newest first)."),
            summary=(
                "Retrieve all stored facts and mentioned people/places/things (newest first). "
                "Use for memory overview or general 'what do you remember' questions"
            ),
        )
        def post_all(req: GetAllMemoriesRequest | None = None) -> MemoryGraph:
            """Retrieve all stored facts and entities.

        router.add_api_route(
            Returns:
                MemoryGraph: All memories and entities.
            """
            if req is None:
                req = GetAllMemoriesRequest()
            return get_all_memories(req)

        @router.post(
            "/search",
            MemoryTool.search_memories,
            methods=["POST"],
            response_model=MemoryGraph,
            summary=(
                "Search for memories by keywords or entities. Use this to check for existing "
                "information before creating a new memory."
                "Search memories by keywords or names of people/places/things. Use before "
                "storing new info or when user asks about specific topics"
            ),
        )
        def post_search(request: SearchMemoryRequest) -> MemoryGraph:
            """Search memories by keywords or entity names.

        router.add_api_route(
            Returns:
                MemoryGraph: Matching memories and entities.
            """
            return search_memories(request)

        @router.post(
            "/entity",
            MemoryTool.get_entity_memories,
            methods=["POST"],
            response_model=MemoryGraph,
            summary=(
                "Retrieve all memories for one or more entities. A useful first step to see what "
                "information is already stored for an entity before adding more."
                "Get facts about specific people, places, or things by exact name. Use when "
                "user asks about particular subjects or before adding more info"
            ),
        )
        def post_entity(request: GetEntityRequest) -> MemoryGraph:
            """Get all facts about specific entities.

        router.add_api_route(
            Returns:
                MemoryGraph: Memories related to the specified entities.
            """
            return get_entity_memories(request)

        @router.post(
            "/delete",
            MemoryTool.delete_memories,
            methods=["POST"],
            summary=(
                "Remove specific memories by their IDs. Entity counts are automatically updated."
                "Delete specific facts by their IDs. Use to remove outdated, incorrect, or "
                "sensitive information. Mentioned people/places/things update automatically"
            ),
        )
        def post_delete(request: DeleteMemoryRequest) -> dict[str, str]:
            """Delete specific facts by their IDs.

        router.add_api_route(
            Returns:
                dict[str, str]: Dictionary with deletion message.
            """
            return delete_memories(request)

        @router.post(
            "/stats",
            MemoryTool.get_summary,
            methods=["GET"],
            response_model=MemorySummary,
            summary=("Get summary statistics on total memories and lifespan/frequency."),
            summary=(
                "Get memory stats: total facts, unique mentions, timespan, top subjects. "
                "Use for storage insights or memory usage questions"
            ),
        )
        def post_stats(req: GetMemorySummaryRequest | None = None) -> MemorySummary:
            """Get memory statistics.

            Returns:
                MemorySummary: Statistics about stored memories.
            """
            if req is None:
                req = GetMemorySummaryRequest()
            return get_summary(req)

        return router
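The router hunk above swaps `router.add_api_route(...)` registrations for decorator-based wrappers that delegate to the `responses` module. The two registration styles are equivalent in FastAPI; a minimal sketch (route paths and names below are illustrative, not from this commit):

```python
# Both registrations below expose an identical POST route in FastAPI.
from fastapi import APIRouter

router = APIRouter()


def handler() -> dict[str, bool]:
    return {"ok": True}


# Old style, as removed by this commit:
router.add_api_route("/old", handler, methods=["POST"])


# New style, as adopted by this commit:
@router.post("/new")
def new_handler() -> dict[str, bool]:
    return handler()
```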
@@ -13,6 +13,44 @@ from typing import Any

from pydantic import BaseModel, Field


class CategoriesRequest(BaseModel):
    """Represents a request to get available search categories.

    This is an empty request model for consistency with POST endpoints.
    """


class CategoriesResponse(BaseModel):
    """Represents the response containing the available search categories.

    Attributes:
        categories: A list of the available search categories.
        note: An optional note providing additional information.
    """

    categories: list[str] = Field(..., description="Available search categories")
    note: str | None = Field(None, description="Additional notes about the response")


class EnginesRequest(BaseModel):
    """Represents a request to get available search engines.

    This is an empty request model for consistency with POST endpoints.
    """


class EnginesResponse(BaseModel):
    """Represents the response containing the available search engines.

    Attributes:
        engines: A list of the available search engines.
        note: An optional note providing additional information.
    """

    engines: list[str] = Field(..., description="Available search engines")
    note: str | None = Field(None, description="Additional notes about the response")


class SearchRequest(BaseModel):
    """Represents a request to perform a search using the SearXNG tool.

@@ -20,8 +58,8 @@ class SearchRequest(BaseModel):
        query: The search query string.
        categories: Comma-separated list of categories to search in.
        engines: Comma-separated list of search engines to use.
        language: The language code for the search (e.g., 'en', 'de').
        format: The desired response format (e.g., 'json', 'csv').
        language: The language code for the search (e.g. 'en', 'de').
        format: The desired response format (e.g. 'json', 'csv').
        pageno: The page number of the search results to retrieve.
    """

@@ -35,24 +73,6 @@ class SearchRequest(BaseModel):
    pageno: int | None = Field(1, description="Page number")


class SearchResult(BaseModel):
    """Represents a single search result returned by the SearXNG tool.

    Attributes:
        title: The title of the search result.
        url: The URL of the search result.
        content: A snippet or summary of the result's content.
        engine: The search engine that provided the result.
        category: The category to which the result belongs.
    """

    title: str = Field(..., description="Result title")
    url: str = Field(..., description="Result URL")
    content: str | None = Field(None, description="Result content/snippet")
    engine: str = Field(..., description="Search engine that provided this result")
    category: str = Field(..., description="Category of the result")


class SearchResponse(BaseModel):
    """Represents the full response from a search operation.

@@ -73,37 +93,19 @@ class SearchResponse(BaseModel):
    engines: list[str] | None = Field(None, description="Engines used for search")


class CategoriesResponse(BaseModel):
    """Represents the response containing the available search categories.
class SearchResult(BaseModel):
    """Represents a single search result returned by the SearXNG tool.

    Attributes:
        categories: A list of the available search categories.
        note: An optional note providing additional information.
        title: The title of the search result.
        url: The URL of the search result.
        content: A snippet or summary of the result's content.
        engine: The search engine that provided the result.
        category: The category to which the result belongs.
    """

    categories: list[str] = Field(..., description="Available search categories")
    note: str | None = Field(None, description="Additional notes about the response")


class EnginesResponse(BaseModel):
    """Represents the response containing the available search engines.

    Attributes:
        engines: A list of the available search engines.
        note: An optional note providing additional information.
    """

    engines: list[str] = Field(..., description="Available search engines")
    note: str | None = Field(None, description="Additional notes about the response")


class HealthResponse(BaseModel):
    """Represents the health status of the SearXNG service.

    Attributes:
        status: The health status of the service (e.g., 'OK').
        searxng_url: The URL of the SearXNG instance being checked.
    """

    status: str = Field(..., description="Service health status")
    searxng_url: str = Field(..., description="SearXNG instance URL")
    title: str = Field(..., description="Result title")
    url: str = Field(..., description="Result URL")
    content: str | None = Field(None, description="Result content/snippet")
    engine: str = Field(..., description="Search engine that provided this result")
    category: str = Field(..., description="Category of the result")
97 openapi_mcp_server/tools/searxng/responses.py Normal file

@@ -0,0 +1,97 @@
"""This module contains response generation logic for the SearXNG tool.

It handles the core functionality of performing searches against a SearXNG
instance and processing the responses into structured data models.
"""

from __future__ import annotations

import json

import requests
from fastapi import HTTPException

from openapi_mcp_server.core.config import get_app_config

from .models import SearchResponse, SearchResult


def perform_search(
    query: str,
    categories: str | None = None,
    engines: str | None = None,
    language: str = "en",
    response_format: str = "json",
    pageno: int = 1,
) -> SearchResponse:
    """Performs the actual search operation by querying the SearXNG instance.

    This function constructs the search request, sends it to the
    SearXNG server, and processes the response.

    Args:
        query: The search query string.
        categories: A comma-separated list of categories to search in.
        engines: A comma-separated list of search engines to use.
        language: The language code for the search.
        response_format: The desired response format (only 'json' is supported).
        pageno: The page number of the search results.

    Returns:
        A SearchResponse object containing the search results.

    Raises:
        HTTPException: If the search request fails, the service is
            unavailable, or the response is invalid.
    """
    app_config = get_app_config()
    params: dict[str, str | int] = {
        "q": query,
        "format": response_format,
        "lang": language,
        "pageno": pageno,
    }

    if categories:
        params["categories"] = categories

    if engines:
        params["engines"] = engines

    try:
        response = requests.get(f"{app_config.searxng_base_url}/search", params=params, timeout=30)
        response.raise_for_status()

        if response_format.lower() != "json":
            raise HTTPException(status_code=400, detail="Only JSON format is supported")

        data = response.json()

        results = []
        for result in data.get("results", []):
            search_result = SearchResult(
                title=result.get("title", ""),
                url=result.get("url", ""),
                content=result.get("content", result.get("snippet", "")),
                engine=result.get("engine", "unknown"),
                category=result.get("category", "general"),
            )
            results.append(search_result)

        return SearchResponse(
            query=query,
            number_of_results=data.get("number_of_results", len(results)),
            results=results,
            infoboxes=data.get("infoboxes"),
            suggestions=data.get("suggestions"),
            engines=data.get("engines"),
        )

    except requests.exceptions.RequestException as e:
        raise HTTPException(status_code=503, detail=f"SearXNG service unavailable: {e!s}") from e
    except json.JSONDecodeError as e:
        raise HTTPException(
            status_code=502, detail=f"Invalid JSON response from SearXNG: {e!s}"
        ) from e
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Search error: {e!s}") from e
@ -8,21 +8,21 @@ configured SearXNG instance.
|
|||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
|
||||
import requests
|
||||
from fastapi import APIRouter, HTTPException
|
||||
from fastapi import APIRouter
|
||||
|
||||
from openapi_mcp_server.core.config import get_app_config
|
||||
from openapi_mcp_server.tools.base import BaseTool
|
||||
|
||||
from .models import (
|
||||
CategoriesRequest,
|
||||
CategoriesResponse,
|
||||
EnginesRequest,
|
||||
EnginesResponse,
|
||||
SearchRequest,
|
||||
SearchResponse,
|
||||
SearchResult,
|
||||
)
|
||||
from .responses import perform_search
|
||||
|
||||
|
||||
class SearxngTool(BaseTool):
|
||||
|
@ -57,9 +57,13 @@ class SearxngTool(BaseTool):
|
|||
@router.post(
|
||||
"/search",
|
||||
response_model=SearchResponse,
|
||||
summary="Search the web across multiple search engines",
|
||||
summary=(
|
||||
"Search across multiple engines for current web information. Use when you need "
|
||||
"recent news, research, or information not in your training data. Returns "
|
||||
"ranked results with snippets"
|
||||
),
|
||||
)
|
||||
def search(request: SearchRequest) -> SearchResponse:
|
||||
def post_search(request: SearchRequest) -> SearchResponse:
|
||||
"""Performs a web search using the configured SearXNG instance.
|
||||
|
||||
This endpoint takes a search query and various parameters to control
|
||||
|
@ -71,7 +75,7 @@ class SearxngTool(BaseTool):
|
|||
Returns:
|
||||
A SearchResponse object containing the search results.
|
||||
"""
|
||||
return SearxngTool._perform_search(
|
||||
return perform_search(
|
||||
request.query,
|
||||
request.categories,
|
||||
request.engines,
|
||||
|
@ -80,17 +84,23 @@ class SearxngTool(BaseTool):
|
|||
request.pageno or 1,
|
||||
)
|
||||
|
||||
@router.get(
|
||||
@router.post(
|
||||
"/categories",
|
||||
response_model=CategoriesResponse,
|
||||
summary="Get available search categories (general, images, news, etc)",
|
||||
summary=(
|
||||
"Get available search categories (images, news, videos, science, etc). Use before "
|
||||
"searching to target specific content types for better results"
|
||||
),
|
||||
)
|
||||
def categories() -> CategoriesResponse:
|
||||
def post_categories(req: CategoriesRequest | None = None) -> CategoriesResponse: # noqa: ARG001
|
||||
"""Retrieves the available search categories from the SearXNG instance.
|
||||
|
||||
If the SearXNG instance is unavailable, it returns a default list of
|
||||
common categories.
|
||||
|
||||
Args:
|
||||
req: A CategoriesRequest object (empty, for consistency).
|
||||
|
||||
Returns:
|
||||
A CategoriesResponse object containing the list of available
|
||||
search categories.
|
||||
|
@ -121,14 +131,20 @@ class SearxngTool(BaseTool):
|
|||
note=f"Using default categories (SearXNG config unavailable: {e!s})",
|
||||
)
|
||||
|
||||
@router.get(
|
||||
@router.post(
|
||||
"/engines",
|
||||
response_model=EnginesResponse,
|
||||
summary="Get list of available search engines (Google, Bing, DuckDuckGo, etc)",
|
||||
summary=(
|
||||
"List available search engines (Google, DuckDuckGo, Bing, etc). Use to limit "
|
||||
"search to specific engines when you need results from particular sources"
|
||||
),
|
||||
)
|
||||
def engines() -> EnginesResponse:
|
||||
def post_engines(req: EnginesRequest | None = None) -> EnginesResponse: # noqa: ARG001
|
||||
"""Retrieves the available search engines from the SearXNG instance.
|
||||
|
||||
Args:
|
||||
req: An EnginesRequest object (empty, for consistency).
|
||||
|
||||
Returns:
|
||||
An EnginesResponse object containing the list of available
|
||||
search engines.
|
||||
|
@ -151,85 +167,5 @@ class SearxngTool(BaseTool):
|
|||
|
||||
return router
|
||||
|
||||
@staticmethod
|
||||
def _perform_search(
|
||||
query: str,
|
||||
categories: str | None = None,
|
||||
engines: str | None = None,
|
||||
language: str = "en",
|
||||
response_format: str = "json",
|
||||
pageno: int = 1,
|
||||
) -> SearchResponse:
|
||||
"""Performs the actual search operation by querying the SearXNG instance.
|
||||
|
||||
This static method constructs the search request, sends it to the
|
||||
SearXNG server, and processes the response.
|
||||
|
||||
Args:
|
||||
query: The search query string.
|
||||
categories: A comma-separated list of categories to search in.
|
||||
engines: A comma-separated list of search engines to use.
|
||||
language: The language code for the search.
|
||||
response_format: The desired response format (only 'json' is supported).
|
||||
pageno: The page number of the search results.
|
||||
|
||||
Returns:
|
||||
A SearchResponse object containing the search results.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the search request fails, the service is
|
||||
unavailable, or the response is invalid.
|
||||
"""
|
||||
app_config = get_app_config()
|
||||
params = {"q": query, "format": response_format, "lang": language, "pageno": pageno}
|
||||
|
||||
if categories:
|
||||
params["categories"] = categories
|
||||
|
||||
if engines:
|
||||
params["engines"] = engines
|
||||
|
||||
try:
|
||||
response = requests.get(
|
||||
f"{app_config.searxng_base_url}/search", params=params, timeout=30
|
||||
)
|
||||
response.raise_for_status()
|
||||
|
||||
if response_format.lower() != "json":
|
||||
raise HTTPException(status_code=400, detail="Only JSON format is supported")
|
||||
|
||||
data = response.json()
|
||||
|
||||
results = []
|
||||
for result in data.get("results", []):
|
||||
search_result = SearchResult(
|
||||
title=result.get("title", ""),
|
||||
url=result.get("url", ""),
|
||||
content=result.get("content", result.get("snippet", "")),
|
||||
engine=result.get("engine", "unknown"),
|
||||
category=result.get("category", "general"),
|
||||
)
|
||||
results.append(search_result)
|
||||
|
||||
return SearchResponse(
|
||||
query=query,
|
||||
number_of_results=data.get("number_of_results", len(results)),
|
||||
results=results,
|
||||
infoboxes=data.get("infoboxes"),
|
||||
suggestions=data.get("suggestions"),
|
||||
engines=data.get("engines"),
|
||||
)
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise HTTPException(
|
||||
status_code=503, detail=f"SearXNG service unavailable: {e!s}"
|
||||
) from e
|
||||
except json.JSONDecodeError as e:
|
||||
raise HTTPException(
|
||||
status_code=502, detail=f"Invalid JSON response from SearXNG: {e!s}"
|
||||
) from e
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"Search error: {e!s}") from e
|
||||
|
||||
|
||||
tool = SearxngTool()
|
||||
|
|
|
@ -13,14 +13,14 @@ from typing import Literal
|
|||
from pydantic import BaseModel, Field
|
||||
|
||||
|
||||
class GetTimeInput(BaseModel):
|
||||
"""Represents the input for getting the current time.
|
||||
class ConvertedTimeResponse(BaseModel):
|
||||
"""Represents the response containing a converted time.
|
||||
|
||||
Attributes:
|
||||
timezone: The IANA timezone for which to get the current time.
|
||||
converted_time: The converted time in ISO 8601 format.
|
||||
"""
|
||||
|
||||
timezone: str = Field("UTC", description="IANA timezone name (e.g. 'UTC', 'Europe/London')")
|
||||
converted_time: str = Field(..., description="Converted time in ISO format")
|
||||
|
||||
|
||||
class ConvertTimeInput(BaseModel):
|
||||
|
@ -33,7 +33,7 @@ class ConvertTimeInput(BaseModel):
|
|||
"""
|
||||
|
||||
timestamp: str = Field(
|
||||
..., description="ISO 8601 formatted time string (e.g., 2024-01-01T12:00:00Z)"
|
||||
..., description="ISO 8601 formatted time string (e.g. 2024-01-01T12:00:00Z)"
|
||||
)
|
||||
from_tz: str = Field(
|
||||
..., description="Original IANA time zone of input (e.g. UTC or Europe/Berlin)"
|
||||
|
@ -57,6 +57,45 @@ class ElapsedTimeInput(BaseModel):
|
|||
)
|
||||
|
||||
|
||||
class ElapsedTimeResponse(BaseModel):
|
||||
"""Represents the response for an elapsed time calculation.
|
||||
|
||||
Attributes:
|
||||
elapsed: The elapsed time in the specified units.
|
||||
unit: The unit of the elapsed time.
|
||||
"""
|
||||
|
||||
elapsed: float = Field(..., description="Elapsed time in specified units")
|
||||
unit: str = Field(..., description="Unit of elapsed time")
|
||||
|
||||
|
||||
class GetTimeInput(BaseModel):
|
||||
"""Represents the input for getting the current time.
|
||||
|
||||
Attributes:
|
||||
timezone: The IANA timezone for which to get the current time.
|
||||
"""
|
||||
|
||||
timezone: str = Field("UTC", description="IANA timezone name (e.g. 'UTC', 'Europe/London')")
|
||||
|
||||
|
||||
class ListTimeZonesRequest(BaseModel):
|
||||
"""Represents a request to list all valid IANA timezone names.
|
||||
|
||||
This is an empty request model for consistency with POST endpoints.
|
||||
"""
|
||||
|
||||
|
||||
class ParsedTimestampResponse(BaseModel):
|
||||
"""Represents the response containing a parsed timestamp.
|
||||
|
||||
Attributes:
|
||||
utc: The parsed timestamp in UTC ISO 8601 format.
|
||||
"""
|
||||
|
||||
utc: str = Field(..., description="Parsed timestamp in UTC ISO format")
|
||||
|
||||
|
||||
class ParseTimestampInput(BaseModel):
|
||||
"""Represents the input for parsing a flexible timestamp string.
|
||||
|
||||
|
@ -66,23 +105,11 @@ class ParseTimestampInput(BaseModel):
|
|||
"""
|
||||
|
||||
timestamp: str = Field(
|
||||
..., description="Flexible input timestamp string (e.g., 2024-06-01 12:00 PM)"
|
||||
..., description="Flexible input timestamp string (e.g. 2024-06-01 12:00 PM)"
|
||||
)
|
||||
timezone: str = Field("UTC", description="Assumed timezone if none is specified in input")
|
||||
|
||||
|
||||
class UnixToIsoInput(BaseModel):
|
||||
"""Represents the input for converting a Unix timestamp to ISO 8601 format.
|
||||
|
||||
Attributes:
|
||||
timestamp: The Unix epoch timestamp (seconds since 1970-01-01).
|
||||
timezone: The target timezone for the output ISO string.
|
||||
"""
|
||||
|
||||
timestamp: float = Field(..., description="Unix epoch timestamp (seconds since 1970-01-01)")
|
||||
timezone: str = Field("UTC", description="Target timezone for output (defaults to UTC)")
|
||||
|
||||
|
||||
class TimeResponse(BaseModel):
|
||||
"""Represents the response containing the current time and timezone.
|
||||
|
||||
|
@ -95,36 +122,16 @@ class TimeResponse(BaseModel):
|
|||
tz: str = Field(..., description="IANA timezone name used")
|
||||
|
||||
|
||||
class ConvertedTimeResponse(BaseModel):
|
||||
"""Represents the response containing a converted time.
|
||||
class UnixToIsoInput(BaseModel):
|
||||
"""Represents the input for converting a Unix timestamp to ISO 8601 format.
|
||||
|
||||
Attributes:
|
||||
converted_time: The converted time in ISO 8601 format.
|
||||
timestamp: The Unix epoch timestamp (seconds since 1970-01-01).
|
||||
timezone: The target timezone for the output ISO string.
|
||||
"""
|
||||
|
||||
converted_time: str = Field(..., description="Converted time in ISO format")
|
||||
|
||||
|
||||
class ElapsedTimeResponse(BaseModel):
|
||||
"""Represents the response for an elapsed time calculation.
|
||||
|
||||
Attributes:
|
||||
elapsed: The elapsed time in the specified units.
|
||||
unit: The unit of the elapsed time.
|
||||
"""
|
||||
|
||||
elapsed: float = Field(..., description="Elapsed time in specified units")
|
||||
unit: str = Field(..., description="Unit of elapsed time")
|
||||
|
||||
|
||||
class ParsedTimestampResponse(BaseModel):
|
||||
"""Represents the response containing a parsed timestamp.
|
||||
|
||||
Attributes:
|
||||
utc: The parsed timestamp in UTC ISO 8601 format.
|
||||
"""
|
||||
|
||||
utc: str = Field(..., description="Parsed timestamp in UTC ISO format")
|
||||
timestamp: float = Field(..., description="Unix epoch timestamp (seconds since 1970-01-01)")
|
||||
timezone: str = Field("UTC", description="Target timezone for output (defaults to UTC)")
|
||||
|
||||
|
||||
class UnixToIsoResponse(BaseModel):
|
||||
|
|
167
openapi_mcp_server/tools/time/responses.py
Normal file
167
openapi_mcp_server/tools/time/responses.py
Normal file
|
@ -0,0 +1,167 @@
|
|||
"""This module contains response generation logic for the Time tool.
|
||||
|
||||
It handles the core functionality of time-related operations including timezone
|
||||
conversions, timestamp parsing, elapsed time calculations, and Unix timestamp
|
||||
conversions.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from datetime import UTC, datetime
|
||||
|
||||
import pytz
|
||||
from dateutil import parser as dateutil_parser
|
||||
from fastapi import HTTPException
|
||||
|
||||
from .models import (
|
||||
ConvertedTimeResponse,
|
||||
ConvertTimeInput,
|
||||
ElapsedTimeInput,
|
||||
ElapsedTimeResponse,
|
||||
ListTimeZonesRequest,
|
||||
ParsedTimestampResponse,
|
||||
ParseTimestampInput,
|
||||
TimeResponse,
|
||||
UnixToIsoInput,
|
||||
UnixToIsoResponse,
|
||||
)
|
||||
|
||||
|
||||
def get_current(timezone: str = "UTC") -> TimeResponse:
|
||||
"""Retrieves the current time in the specified timezone.
|
||||
|
||||
Args:
|
||||
timezone: The IANA timezone name (e.g. 'UTC', 'America/New_York').
|
||||
Defaults to 'UTC'.
|
||||
|
||||
Returns:
|
||||
A TimeResponse object containing the current time in ISO format
|
||||
and the timezone used.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the provided timezone is invalid.
|
||||
"""
|
||||
try:
|
||||
tz = pytz.timezone(timezone)
|
||||
now = datetime.now(tz)
|
||||
return TimeResponse(time=now.isoformat(), tz=timezone)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timezone: {e}") from e
|
||||
|
||||
|
||||
def unix_to_iso(data: UnixToIsoInput) -> UnixToIsoResponse:
|
||||
"""Converts a Unix epoch timestamp to an ISO 8601 formatted string.
|
||||
|
||||
Args:
|
||||
data: A UnixToIsoInput object containing the Unix timestamp and an
|
||||
optional target timezone.
|
||||
|
||||
Returns:
|
||||
A UnixToIsoResponse object with the ISO formatted timestamp.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamp or timezone is invalid.
|
||||
"""
|
||||
try:
|
||||
dt = datetime.fromtimestamp(data.timestamp, tz=UTC)
|
||||
if data.timezone and data.timezone != "UTC":
|
||||
target_tz = pytz.timezone(data.timezone)
|
||||
dt = dt.astimezone(target_tz)
|
||||
return UnixToIsoResponse(iso_time=dt.isoformat())
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timestamp or timezone: {e}") from e
|
||||
|
||||
|
||||
def convert_time(data: ConvertTimeInput) -> ConvertedTimeResponse:
|
||||
"""Converts a timestamp from one timezone to another.
|
||||
|
||||
Args:
|
||||
data: A ConvertTimeInput object containing the timestamp and the
|
||||
source and target timezones.
|
||||
|
||||
Returns:
|
||||
A ConvertedTimeResponse object with the converted timestamp.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamp or any of the timezones are invalid.
|
||||
"""
|
||||
try:
|
||||
from_zone = pytz.timezone(data.from_tz)
|
||||
to_zone = pytz.timezone(data.to_tz)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timezone: {e}") from e
|
||||
|
||||
try:
|
||||
dt = dateutil_parser.parse(data.timestamp)
|
||||
dt = from_zone.localize(dt) if dt.tzinfo is None else dt.astimezone(from_zone)
|
||||
converted = dt.astimezone(to_zone)
|
||||
return ConvertedTimeResponse(converted_time=converted.isoformat())
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timestamp: {e}") from e
|
||||
|
||||
|
||||
def elapsed_time(data: ElapsedTimeInput) -> ElapsedTimeResponse:
|
||||
"""Calculates the elapsed time between two timestamps.
|
||||
|
||||
Args:
|
||||
data: An ElapsedTimeInput object containing the start and end
|
||||
timestamps and the desired units for the result.
|
||||
|
||||
Returns:
|
||||
An ElapsedTimeResponse object with the calculated elapsed time.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamps are invalid.
|
||||
"""
|
||||
try:
|
||||
start_dt = dateutil_parser.parse(data.start)
|
||||
end_dt = dateutil_parser.parse(data.end)
|
||||
delta = end_dt - start_dt
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timestamps: {e}") from e
|
||||
|
||||
seconds = delta.total_seconds()
|
||||
result = {
|
||||
"seconds": seconds,
|
||||
"minutes": seconds / 60,
|
||||
"hours": seconds / 3600,
|
||||
"days": seconds / 86400,
|
||||
}
|
||||
|
||||
return ElapsedTimeResponse(elapsed=result[data.units], unit=data.units)
|
||||
|
||||
|
||||
def parse_timestamp(data: ParseTimestampInput) -> ParsedTimestampResponse:
|
||||
"""Parses a human-readable timestamp string into a standardised format.
|
||||
|
||||
Args:
|
||||
data: A ParseTimestampInput object containing the timestamp string
|
||||
and an optional timezone.
|
||||
|
||||
Returns:
|
||||
A ParsedTimestampResponse object with the timestamp in UTC ISO format.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamp string cannot be parsed.
|
||||
"""
|
||||
try:
|
||||
tz = pytz.timezone(data.timezone)
|
||||
dt = dateutil_parser.parse(data.timestamp)
|
||||
if dt.tzinfo is None:
|
||||
dt = tz.localize(dt)
|
||||
dt_utc = dt.astimezone(pytz.utc)
|
||||
return ParsedTimestampResponse(utc=dt_utc.isoformat())
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Could not parse: {e}") from e
|
||||
|
||||
|
||||
def list_time_zones(req: ListTimeZonesRequest) -> list[str]: # noqa: ARG001
|
||||
"""Retrieves a list of all valid IANA timezone names.
|
||||
|
||||
Args:
|
||||
req: A ListTimeZonesRequest object (empty, for consistency).
|
||||
|
||||
Returns:
|
||||
A list of strings, where each string is a valid IANA timezone.
|
||||
"""
|
||||
return list(pytz.all_timezones)
|
|
@ -8,11 +8,7 @@ a reliable and secure source for time information within the system.
|
|||
|
||||
from __future__ import annotations
|
||||
|
||||
from datetime import UTC, datetime
|
||||
|
||||
import pytz
|
||||
from dateutil import parser as dateutil_parser
|
||||
from fastapi import APIRouter, HTTPException
|
||||
from fastapi import APIRouter
|
||||
|
||||
from openapi_mcp_server.tools.base import BaseTool
|
||||
|
||||
|
@ -22,12 +18,21 @@ from .models import (
|
|||
ElapsedTimeInput,
|
||||
ElapsedTimeResponse,
|
||||
GetTimeInput,
|
||||
ListTimeZonesRequest,
|
||||
ParsedTimestampResponse,
|
||||
ParseTimestampInput,
|
||||
TimeResponse,
|
||||
UnixToIsoInput,
|
||||
UnixToIsoResponse,
|
||||
)
|
||||
from .responses import (
|
||||
convert_time,
|
||||
elapsed_time,
|
||||
get_current,
|
||||
list_time_zones,
|
||||
parse_timestamp,
|
||||
unix_to_iso,
|
||||
)
|
||||
|
||||
|
||||
class TimeTool(BaseTool):
|
||||
|
@ -47,145 +52,6 @@ class TimeTool(BaseTool):
|
|||
description="Provides secure UTC/local time retrieval and formatting",
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def get_current(timezone: str = "UTC") -> TimeResponse:
|
||||
"""Retrieves the current time in the specified timezone.
|
||||
|
||||
Args:
|
||||
timezone: The IANA timezone name (e.g., 'UTC', 'America/New_York').
|
||||
Defaults to 'UTC'.
|
||||
|
||||
Returns:
|
||||
A TimeResponse object containing the current time in ISO format
|
||||
and the timezone used.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the provided timezone is invalid.
|
||||
"""
|
||||
try:
|
||||
tz = pytz.timezone(timezone)
|
||||
now = datetime.now(tz)
|
||||
return TimeResponse(time=now.isoformat(), tz=timezone)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timezone: {e}") from e
|
||||
|
||||
@staticmethod
|
||||
def unix_to_iso(data: UnixToIsoInput) -> UnixToIsoResponse:
|
||||
"""Converts a Unix epoch timestamp to an ISO 8601 formatted string.
|
||||
|
||||
Args:
|
||||
data: A UnixToIsoInput object containing the Unix timestamp and an
|
||||
optional target timezone.
|
||||
|
||||
Returns:
|
||||
A UnixToIsoResponse object with the ISO formatted timestamp.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamp or timezone is invalid.
|
||||
"""
|
||||
try:
|
||||
dt = datetime.fromtimestamp(data.timestamp, tz=UTC)
|
||||
if data.timezone and data.timezone != "UTC":
|
||||
target_tz = pytz.timezone(data.timezone)
|
||||
dt = dt.astimezone(target_tz)
|
||||
return UnixToIsoResponse(iso_time=dt.isoformat())
|
||||
except Exception as e:
|
||||
raise HTTPException(
|
||||
status_code=400, detail=f"Invalid timestamp or timezone: {e}"
|
||||
) from e
|
||||
|
||||
@staticmethod
|
||||
def convert_time(data: ConvertTimeInput) -> ConvertedTimeResponse:
|
||||
"""Converts a timestamp from one timezone to another.
|
||||
|
||||
Args:
|
||||
data: A ConvertTimeInput object containing the timestamp and the
|
||||
source and target timezones.
|
||||
|
||||
Returns:
|
||||
A ConvertedTimeResponse object with the converted timestamp.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamp or any of the timezones are invalid.
|
||||
"""
|
||||
try:
|
||||
from_zone = pytz.timezone(data.from_tz)
|
||||
to_zone = pytz.timezone(data.to_tz)
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timezone: {e}") from e
|
||||
|
||||
try:
|
||||
dt = dateutil_parser.parse(data.timestamp)
|
||||
dt = from_zone.localize(dt) if dt.tzinfo is None else dt.astimezone(from_zone)
|
||||
converted = dt.astimezone(to_zone)
|
||||
return ConvertedTimeResponse(converted_time=converted.isoformat())
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timestamp: {e}") from e
|
||||
|
||||
@staticmethod
|
||||
def elapsed_time(data: ElapsedTimeInput) -> ElapsedTimeResponse:
|
||||
"""Calculates the elapsed time between two timestamps.
|
||||
|
||||
Args:
|
||||
data: An ElapsedTimeInput object containing the start and end
|
||||
timestamps and the desired units for the result.
|
||||
|
||||
Returns:
|
||||
An ElapsedTimeResponse object with the calculated elapsed time.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamps are invalid.
|
||||
"""
|
||||
try:
|
||||
start_dt = dateutil_parser.parse(data.start)
|
||||
end_dt = dateutil_parser.parse(data.end)
|
||||
delta = end_dt - start_dt
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid timestamps: {e}") from e
|
||||
|
||||
seconds = delta.total_seconds()
|
||||
result = {
|
||||
"seconds": seconds,
|
||||
"minutes": seconds / 60,
|
||||
"hours": seconds / 3600,
|
||||
"days": seconds / 86400,
|
||||
}
|
||||
|
||||
return ElapsedTimeResponse(elapsed=result[data.units], unit=data.units)
|
||||
|
||||
@staticmethod
|
||||
def parse_timestamp(data: ParseTimestampInput) -> ParsedTimestampResponse:
|
||||
"""Parses a human-readable timestamp string into a standardised format.
|
||||
|
||||
Args:
|
||||
data: A ParseTimestampInput object containing the timestamp string
|
||||
and an optional timezone.
|
||||
|
||||
Returns:
|
||||
A ParsedTimestampResponse object with the timestamp in UTC ISO format.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the timestamp string cannot be parsed.
|
||||
"""
|
||||
try:
|
||||
tz = pytz.timezone(data.timezone)
|
||||
dt = dateutil_parser.parse(data.timestamp)
|
||||
if dt.tzinfo is None:
|
||||
dt = tz.localize(dt)
|
||||
dt_utc = dt.astimezone(pytz.utc)
|
||||
return ParsedTimestampResponse(utc=dt_utc.isoformat())
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=400, detail=f"Could not parse: {e}") from e
|
||||
|
||||
@staticmethod
|
||||
def list_time_zones() -> list[str]:
|
||||
"""Retrieves a list of all valid IANA timezone names.
|
||||
|
||||
Returns:
|
||||
A list of strings, where each string is a valid IANA timezone.
|
||||
"""
|
||||
return list(pytz.all_timezones)
|
||||
|
||||
def get_router(self) -> APIRouter:
|
||||
"""Creates and returns the FastAPI router for the time tool.
|
||||
|
||||
|
@ -197,83 +63,90 @@ class TimeTool(BaseTool):
|
|||
router = APIRouter()
|
||||
|
||||
@router.post(
|
||||
"/get_time",
|
||||
"/current",
|
||||
response_model=TimeResponse,
|
||||
summary="Get current time in specified IANA timezone (defaults to UTC)",
|
||||
summary=("Get current time in any timezone. Defaults to UTC if no timezone specified"),
|
||||
)
|
||||
def get_time(data: GetTimeInput | None = None) -> TimeResponse:
|
||||
def post_current(data: GetTimeInput | None = None) -> TimeResponse:
|
||||
"""Retrieves the current time in the specified timezone.
|
||||
|
||||
Returns:
|
||||
TimeResponse: The current time in the specified timezone.
|
||||
"""
|
||||
if data is None:
|
||||
data = GetTimeInput()
|
||||
return TimeTool.get_current(data.timezone)
|
||||
data = GetTimeInput(timezone="UTC")
|
||||
return get_current(data.timezone)
|
||||
|
||||
@router.post(
|
||||
"/unix_to_iso",
|
||||
response_model=UnixToIsoResponse,
|
||||
summary="Convert Unix epoch timestamp to ISO format",
|
||||
summary=("Convert Unix epoch timestamp to human-readable ISO format"),
|
||||
)
|
||||
def unix_to_iso(data: UnixToIsoInput) -> UnixToIsoResponse:
|
||||
def post_unix_to_iso(data: UnixToIsoInput) -> UnixToIsoResponse:
|
||||
"""Converts a Unix epoch timestamp to an ISO 8601 formatted string.
|
||||
|
||||
Returns:
|
||||
UnixToIsoResponse: The ISO 8601 formatted timestamp.
|
||||
"""
|
||||
return TimeTool.unix_to_iso(data)
|
||||
return unix_to_iso(data)
|
||||
|
||||
@router.post(
|
||||
"/convert_time",
|
||||
"/convert",
|
||||
response_model=ConvertedTimeResponse,
|
||||
summary="Convert timestamp from one timezone to another",
|
||||
summary=("Convert time between timezones"),
|
||||
)
|
||||
def convert_time(data: ConvertTimeInput) -> ConvertedTimeResponse:
|
||||
def post_convert(data: ConvertTimeInput) -> ConvertedTimeResponse:
|
||||
"""Converts a timestamp from one timezone to another.
|
||||
|
||||
Returns:
|
||||
ConvertedTimeResponse: The converted timestamp.
|
||||
"""
|
||||
return TimeTool.convert_time(data)
|
||||
return convert_time(data)
|
||||
|
||||
@router.post(
|
||||
"/elapsed_time",
|
||||
"/elapsed",
|
||||
response_model=ElapsedTimeResponse,
|
||||
summary="Calculate time difference between two timestamps",
|
||||
summary=("Calculate time elapsed between two ISO timestamps"),
|
||||
)
|
||||
def elapsed_time(data: ElapsedTimeInput) -> ElapsedTimeResponse:
|
||||
def post_elapsed(data: ElapsedTimeInput) -> ElapsedTimeResponse:
|
||||
"""Calculates the elapsed time between two timestamps.
|
||||
|
||||
Returns:
|
||||
ElapsedTimeResponse: The elapsed time.
|
||||
"""
|
||||
return TimeTool.elapsed_time(data)
|
||||
return elapsed_time(data)
|
||||
|
||||
@router.post(
|
||||
"/parse_timestamp",
|
||||
"/parse",
|
||||
response_model=ParsedTimestampResponse,
|
||||
summary=(
|
||||
"Parse flexible human-readable timestamps (e.g. 'June 1st 2024 3:30 PM', "
|
||||
"'tomorrow at noon', '2024-06-01 15:30') into standardized UTC ISO format"
|
||||
"Parse flexible human dates/times like 'tomorrow at 3pm', 'June 15th', 'next "
|
||||
"Friday' into standard ISO format"
|
||||
),
|
||||
)
|
||||
def parse_timestamp(data: ParseTimestampInput) -> ParsedTimestampResponse:
|
||||
def post_parse(data: ParseTimestampInput) -> ParsedTimestampResponse:
|
||||
"""Parses a human-readable timestamp string into a standardised format.
|
||||
|
||||
Returns:
|
||||
ParsedTimestampResponse: The parsed timestamp in UTC ISO format.
|
||||
"""
|
||||
return TimeTool.parse_timestamp(data)
|
||||
return parse_timestamp(data)
|
||||
|
||||
@router.get("/list_time_zones", summary="Get list of all valid IANA timezone names")
|
||||
def list_time_zones() -> list[str]:
|
||||
@router.post(
|
||||
"/list_zones",
|
||||
summary=(
|
||||
"List all valid IANA timezone names (e.g. 'America/New_York', 'Europe/London')"
|
||||
),
|
||||
)
|
||||
def post_list_zones(req: ListTimeZonesRequest | None = None) -> list[str]:
|
||||
"""Retrieves a list of all valid IANA timezone names.
|
||||
|
||||
Returns:
|
||||
list[str]: A list of strings, where each string is a valid IANA timezone.
|
||||
"""
|
||||
return TimeTool.list_time_zones()
|
||||
if req is None:
|
||||
req = ListTimeZonesRequest()
|
||||
return list_time_zones(req)
|
||||
|
||||
return router
|
||||
|
||||
|
|
|
@ -32,22 +32,6 @@ class CurrentWeather(BaseModel):
|
|||
)
|
||||
|
||||
|
||||
class HourlyUnits(BaseModel):
|
||||
"""Represents the units for the hourly weather data.
|
||||
|
||||
Attributes:
|
||||
time: The unit for the time values (e.g., "iso8601").
|
||||
temperature_2m: The unit for the temperature values (e.g., "°C").
|
||||
relative_humidity_2m: The unit for the relative humidity values (e.g., "%").
|
||||
wind_speed_10m: The unit for the wind speed values (e.g., "km/h").
|
||||
"""
|
||||
|
||||
time: str
|
||||
temperature_2m: str
|
||||
relative_humidity_2m: str
|
||||
wind_speed_10m: str
|
||||
|
||||
|
||||
class HourlyData(BaseModel):
|
||||
"""Represents the hourly weather forecast data.
|
||||
|
||||
|
@ -64,6 +48,22 @@ class HourlyData(BaseModel):
|
|||
wind_speed_10m: list[float]
|
||||
|
||||
|
||||
class HourlyUnits(BaseModel):
|
||||
"""Represents the units for the hourly weather data.
|
||||
|
||||
Attributes:
|
||||
time: The unit for the time values (e.g. "iso8601").
|
||||
temperature_2m: The unit for the temperature values (e.g. "°C").
|
||||
relative_humidity_2m: The unit for the relative humidity values (e.g. "%").
|
||||
wind_speed_10m: The unit for the wind speed values (e.g. "km/h").
|
||||
"""
|
||||
|
||||
time: str
|
||||
temperature_2m: str
|
||||
relative_humidity_2m: str
|
||||
wind_speed_10m: str
|
||||
|
||||
|
||||
class WeatherForecastOutput(BaseModel):
|
||||
"""Represents the complete weather forecast output from the API.
|
||||
|
||||
|
@ -75,8 +75,8 @@ class WeatherForecastOutput(BaseModel):
|
|||
longitude: The longitude of the forecast location.
|
||||
generationtime_ms: The time taken to generate the forecast in milliseconds.
|
||||
utc_offset_seconds: The UTC offset in seconds.
|
||||
timezone: The timezone of the forecast location (e.g., "GMT").
|
||||
timezone_abbreviation: The abbreviated timezone name (e.g., "GMT").
|
||||
timezone: The timezone of the forecast location (e.g. "GMT").
|
||||
timezone_abbreviation: The abbreviated timezone name (e.g. "GMT").
|
||||
elevation: The elevation of the forecast location in metres.
|
||||
current: The current weather conditions.
|
||||
hourly_units: The units for the hourly forecast data.
|
||||
|
@ -93,3 +93,15 @@ class WeatherForecastOutput(BaseModel):
|
|||
current: CurrentWeather = Field(..., description="Current weather conditions")
|
||||
hourly_units: HourlyUnits
|
||||
hourly: HourlyData
|
||||
|
||||
|
||||
class WeatherForecastRequest(BaseModel):
|
||||
"""Represents a request for weather forecast.
|
||||
|
||||
Attributes:
|
||||
latitude: The latitude of the location (optional, will use default if not provided).
|
||||
longitude: The longitude of the location (optional, will use default if not provided).
|
||||
"""
|
||||
|
||||
latitude: float | None = Field(None, description="Latitude for the location (e.g. 52.52)")
|
||||
longitude: float | None = Field(None, description="Longitude for the location (e.g. 13.41)")
|
||||
|
|
123
openapi_mcp_server/tools/weather/responses.py
Normal file
123
openapi_mcp_server/tools/weather/responses.py
Normal file
|
@ -0,0 +1,123 @@
|
|||
"""This module contains response generation logic for the Weather tool.
|
||||
|
||||
It handles the core functionality of retrieving weather forecasts from the
|
||||
Open-Meteo API, including geocoding location names and determining temperature
|
||||
units based on geographical regions.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import requests
|
||||
import reverse_geocoder as rg # type: ignore[import-untyped]
|
||||
from fastapi import HTTPException
|
||||
from geopy.adapters import RequestsAdapter # type: ignore[import-untyped]
|
||||
from geopy.geocoders import Nominatim # type: ignore[import-untyped]
|
||||
|
||||
from openapi_mcp_server.core.config import get_app_config
|
||||
|
||||
from .models import WeatherForecastOutput, WeatherForecastRequest
|
||||
|
||||
OPEN_METEO_URL = "https://api.open-meteo.com/v1/forecast"
|
||||
FAHRENHEIT_COUNTRIES = {"US", "LR", "MM"}
|
||||
|
||||
|
||||
def geocode_location(location: str) -> tuple[float, float] | None:
|
||||
"""Converts a location name into geographical coordinates.
|
||||
|
||||
Args:
|
||||
location: The name of the location to geocode (e.g. "London, UK").
|
||||
|
||||
Returns:
|
||||
A tuple containing the latitude and longitude, or None if the
|
||||
location cannot be found.
|
||||
"""
|
||||
geolocator = Nominatim(user_agent="openapi-mcp-server", adapter_factory=RequestsAdapter)
|
||||
coords = None
|
||||
try:
|
||||
location_data = geolocator.geocode(location)
|
||||
if location_data:
|
||||
coords = float(location_data.latitude), float(location_data.longitude) # type: ignore[attr-defined]
|
||||
except Exception:
|
||||
pass # Geocoding failed, coords remains None
|
||||
return coords
|
||||
|
||||
|
||||
def get_weather_forecast(request: WeatherForecastRequest) -> WeatherForecastOutput:
|
||||
"""Retrieves the weather forecast for a given location.
|
||||
|
||||
This function fetches the current weather conditions and an hourly
|
||||
forecast. The temperature unit (Celsius or Fahrenheit) is determined
|
||||
automatically based on the location's country. If no coordinates
|
||||
are provided, it falls back to a default location if configured.
|
||||
|
||||
Args:
|
||||
request: A WeatherForecastRequest object containing the coordinates.
|
||||
|
||||
Returns:
|
||||
A WeatherForecastOutput object containing the current weather
|
||||
and forecast data.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the coordinates are missing and no default
|
||||
location is set, if the default location cannot be geocoded,
|
||||
or if there is an error communicating with the weather API.
|
||||
"""
|
||||
latitude = request.latitude
|
||||
longitude = request.longitude
|
||||
|
||||
# If coordinates not provided, try to use default location
|
||||
if latitude is None or longitude is None:
|
||||
default_location = get_app_config().default_location
|
||||
if default_location:
|
||||
coords = geocode_location(default_location)
|
||||
if coords:
|
||||
latitude, longitude = coords
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=422,
|
||||
detail=f"Could not geocode default location: {default_location}",
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=422,
|
||||
detail="Latitude and longitude are required, "
|
||||
"and no default location is configured.",
|
||||
)
|
||||
# Determine temperature unit based on location
|
||||
try:
|
||||
geo_results = rg.search((latitude, longitude), mode=1)
|
||||
if geo_results:
|
||||
country_code = geo_results[0]["cc"]
|
||||
temperature_unit = "fahrenheit" if country_code in FAHRENHEIT_COUNTRIES else "celsius"
|
||||
else:
|
||||
temperature_unit = "celsius"
|
||||
except Exception:
|
||||
temperature_unit = "celsius"
|
||||
|
||||
params: dict[str, str | float] = {
|
||||
"latitude": latitude,
|
||||
"longitude": longitude,
|
||||
"current": "temperature_2m,wind_speed_10m",
|
||||
"hourly": "temperature_2m,relative_humidity_2m,wind_speed_10m",
|
||||
"timezone": "auto",
|
||||
"temperature_unit": temperature_unit,
|
||||
}
|
||||
|
||||
try:
|
||||
response = requests.get(OPEN_METEO_URL, params=params, timeout=30)
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
if "current" not in data or "hourly" not in data:
|
||||
raise HTTPException(
|
||||
status_code=500, detail="Unexpected response format from Open-Meteo API"
|
||||
)
|
||||
|
||||
return WeatherForecastOutput(**data)
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise HTTPException(
|
||||
status_code=503, detail=f"Error connecting to Open-Meteo API: {e}"
|
||||
) from e
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"An internal error occurred: {e}") from e
|
|
@ -8,21 +8,12 @@ default location and automatically selects the appropriate temperature unit
|
|||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Annotated
|
||||
from fastapi import APIRouter
|
||||
|
||||
import requests
|
||||
import reverse_geocoder as rg
|
||||
from fastapi import APIRouter, HTTPException, Query
|
||||
from geopy.adapters import RequestsAdapter
|
||||
from geopy.geocoders import Nominatim
|
||||
|
||||
from openapi_mcp_server.core.config import get_app_config
|
||||
from openapi_mcp_server.tools.base import BaseTool
|
||||
|
||||
from .models import WeatherForecastOutput
|
||||
|
||||
OPEN_METEO_URL = "https://api.open-meteo.com/v1/forecast"
|
||||
FAHRENHEIT_COUNTRIES = {"US", "LR", "MM"}
|
||||
from .models import WeatherForecastOutput, WeatherForecastRequest
|
||||
from .responses import get_weather_forecast
|
||||
|
||||
|
||||
class WeatherTool(BaseTool):
|
||||
|
@ -42,28 +33,6 @@ class WeatherTool(BaseTool):
|
|||
name="weather",
|
||||
description="Provides weather retrieval by latitude and longitude using Open-Meteo",
|
||||
)
|
||||
self.geolocator = Nominatim(
|
||||
user_agent="openapi-mcp-server", adapter_factory=RequestsAdapter
|
||||
)
|
||||
|
||||
def geocode_location(self, location: str) -> tuple[float, float] | None:
|
||||
"""Converts a location name into geographical coordinates.
|
||||
|
||||
Args:
|
||||
location: The name of the location to geocode (e.g., "London, UK").
|
||||
|
||||
Returns:
|
||||
A tuple containing the latitude and longitude, or None if the
|
||||
location cannot be found.
|
||||
"""
|
||||
coords = None
|
||||
try:
|
||||
location_data = self.geolocator.geocode(location)
|
||||
if location_data:
|
||||
coords = float(location_data.latitude), float(location_data.longitude) # type: ignore[attr-defined]
|
||||
except Exception:
|
||||
pass # Geocoding failed, coords remains None
|
||||
return coords
|
||||
|
||||
def get_router(self) -> APIRouter:
|
||||
"""Creates and returns the FastAPI router for the weather tool.
|
||||
|
@ -75,17 +44,12 @@ class WeatherTool(BaseTool):
|
|||
"""
|
||||
router = APIRouter()
|
||||
|
||||
@router.get(
|
||||
@router.post(
|
||||
"/forecast",
|
||||
response_model=WeatherForecastOutput,
|
||||
summary="Get current weather conditions and hourly forecast by coordinates",
|
||||
summary=("Get current weather and hourly forecast for any location by coordinates"),
|
||||
)
|
||||
def forecast(
|
||||
latitude: Annotated[float, Query(description="Latitude for the location (e.g. 52.52)")],
|
||||
longitude: Annotated[
|
||||
float, Query(description="Longitude for the location (e.g. 13.41)")
|
||||
],
|
||||
) -> WeatherForecastOutput:
|
||||
def post_forecast(request: WeatherForecastRequest) -> WeatherForecastOutput:
|
||||
"""Retrieves the weather forecast for a given location.
|
||||
|
||||
This endpoint fetches the current weather conditions and an hourly
|
||||
|
@ -93,79 +57,11 @@ class WeatherTool(BaseTool):
|
|||
automatically based on the location's country. If no coordinates
|
||||
are provided, it falls back to a default location if configured.
|
||||
|
||||
Args:
|
||||
latitude: The latitude of the location.
|
||||
longitude: The longitude of the location.
|
||||
|
||||
Returns:
|
||||
A WeatherForecastOutput object containing the current weather
|
||||
and forecast data.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the coordinates are missing and no default
|
||||
location is set, if the default location cannot be geocoded,
|
||||
or if there is an error communicating with the weather API.
|
||||
"""
|
||||
# If coordinates not provided, try to use default location
|
||||
if latitude is None or longitude is None:
|
||||
default_location = get_app_config().default_location
|
||||
if default_location:
|
||||
coords = self.geocode_location(default_location)
|
||||
if coords:
|
||||
latitude, longitude = coords
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=422,
|
||||
detail=f"Could not geocode default location: {default_location}",
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=422,
|
||||
detail="Latitude and longitude are required, "
|
||||
"and no default location is configured.",
|
||||
)
|
||||
# Determine temperature unit based on location
|
||||
try:
|
||||
geo_results = rg.search((latitude, longitude), mode=1)
|
||||
if geo_results:
|
||||
country_code = geo_results[0]["cc"]
|
||||
temperature_unit = (
|
||||
"fahrenheit" if country_code in FAHRENHEIT_COUNTRIES else "celsius"
|
||||
)
|
||||
else:
|
||||
temperature_unit = "celsius"
|
||||
except Exception:
|
||||
temperature_unit = "celsius"
|
||||
|
||||
params = {
|
||||
"latitude": latitude,
|
||||
"longitude": longitude,
|
||||
"current": "temperature_2m,wind_speed_10m",
|
||||
"hourly": "temperature_2m,relative_humidity_2m,wind_speed_10m",
|
||||
"timezone": "auto",
|
||||
"temperature_unit": temperature_unit,
|
||||
}
|
||||
|
||||
try:
|
||||
response = requests.get(OPEN_METEO_URL, params=params, timeout=30)
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
if "current" not in data or "hourly" not in data:
|
||||
raise HTTPException(
|
||||
status_code=500, detail="Unexpected response format from Open-Meteo API"
|
||||
)
|
||||
|
||||
return WeatherForecastOutput(**data)
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise HTTPException(
|
||||
status_code=503, detail=f"Error connecting to Open-Meteo API: {e}"
|
||||
) from e
|
||||
except Exception as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"An internal error occurred: {e}"
|
||||
) from e
|
||||
return get_weather_forecast(request)
|
||||
|
||||
return router
|
||||
|
||||
|
|
|
@ -11,6 +11,32 @@ from __future__ import annotations
|
|||
from pydantic import BaseModel, Field, HttpUrl
|
||||
|
||||
|
||||
class WebRawRequest(BaseModel):
|
||||
"""Represents a request to fetch raw HTML content from a URL.
|
||||
|
||||
Attributes:
|
||||
url: The URL to fetch raw content and headers from.
|
||||
"""
|
||||
|
||||
url: HttpUrl = Field(..., description="URL to fetch raw content and headers from")
|
||||
|
||||
|
||||
class WebRawResponse(BaseModel):
|
||||
"""Represents the response with raw web content.
|
||||
|
||||
Attributes:
|
||||
url: The original URL that was fetched.
|
||||
status_code: The HTTP status code of the response.
|
||||
headers: The response headers.
|
||||
content: The raw HTML content of the response.
|
||||
"""
|
||||
|
||||
url: str = Field(..., description="Original URL that was fetched")
|
||||
status_code: int = Field(..., description="HTTP status code")
|
||||
headers: dict[str, str] = Field(..., description="Response headers")
|
||||
content: str = Field(..., description="Raw HTML content")
|
||||
|
||||
|
||||
class WebRequest(BaseModel):
|
||||
"""Represents a request to parse and extract content from a URL.
|
||||
|
||||
|
@ -46,19 +72,3 @@ class WebResponse(BaseModel):
|
|||
...,
|
||||
description="Extracted content in Markdown format with metadata frontmatter",
|
||||
)
|
||||
|
||||
|
||||
class WebRawResponse(BaseModel):
|
||||
"""Represents the response with raw web content.
|
||||
|
||||
Attributes:
|
||||
url: The original URL that was fetched.
|
||||
status_code: The HTTP status code of the response.
|
||||
headers: The response headers.
|
||||
content: The raw HTML content of the response.
|
||||
"""
|
||||
|
||||
url: str = Field(..., description="Original URL that was fetched")
|
||||
status_code: int = Field(..., description="HTTP status code")
|
||||
headers: dict[str, str] = Field(..., description="Response headers")
|
||||
content: str = Field(..., description="Raw HTML content")
|
||||
|
|
119
openapi_mcp_server/tools/web/responses.py
Normal file
119
openapi_mcp_server/tools/web/responses.py
Normal file
|
@ -0,0 +1,119 @@
|
|||
"""This module contains response generation logic for the Web tool.
|
||||
|
||||
It handles the core functionality of fetching and parsing web content using
|
||||
the trafilatura library, as well as fetching raw HTML content and headers.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import requests
|
||||
import trafilatura
|
||||
from fastapi import HTTPException
|
||||
|
||||
from openapi_mcp_server.core.config import get_app_config
|
||||
|
||||
from .models import WebRawResponse, WebResponse
|
||||
|
||||
|
||||
def parse_web_page(
|
||||
url: str,
|
||||
with_metadata: bool = True,
|
||||
include_formatting: bool = True,
|
||||
include_images: bool = True,
|
||||
include_links: bool = True,
|
||||
include_tables: bool = True,
|
||||
) -> WebResponse:
|
||||
"""Performs the actual web page parsing using the trafilatura library.
|
||||
|
||||
This function handles the fetching and extraction of content from a
|
||||
given URL based on the provided options.
|
||||
|
||||
Args:
|
||||
url: The URL of the web page to parse.
|
||||
with_metadata: Whether to include metadata in the extraction.
|
||||
include_formatting: Whether to keep formatting in the output.
|
||||
include_images: Whether to include images in the output.
|
||||
include_links: Whether to include links in the output.
|
||||
include_tables: Whether to include tables in the output.
|
||||
|
||||
Returns:
|
||||
A WebResponse object containing the parsed web content.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the URL cannot be fetched, content cannot be
|
||||
extracted, or another error occurs during parsing.
|
||||
"""
|
||||
try:
|
||||
config = get_app_config()
|
||||
headers = {"User-Agent": config.web_user_agent}
|
||||
response = requests.get(url, headers=headers, timeout=30)
|
||||
response.raise_for_status()
|
||||
downloaded = response.text
|
||||
if not downloaded:
|
||||
raise HTTPException(status_code=404, detail=f"Unable to fetch content from URL: {url}")
|
||||
|
||||
extracted_content = trafilatura.extract(
|
||||
downloaded,
|
||||
output_format="markdown",
|
||||
favor_recall=True,
|
||||
with_metadata=with_metadata,
|
||||
include_formatting=include_formatting,
|
||||
include_images=include_images,
|
||||
include_links=include_links,
|
||||
include_tables=include_tables,
|
||||
)
|
||||
|
||||
if not extracted_content:
|
||||
raise HTTPException(
|
||||
status_code=422, detail=f"Unable to extract content from URL: {url}"
|
||||
)
|
||||
|
||||
return WebResponse(
|
||||
url=url,
|
||||
content=extracted_content,
|
||||
)
|
||||
|
||||
except HTTPException:
|
||||
raise
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"Error parsing web page: {e!s}") from e
|
||||
|
||||
|
||||
def fetch_raw_content(url: str) -> WebRawResponse:
|
||||
"""Fetches the raw HTML content and headers from a given URL.
|
||||
|
||||
This function is useful for retrieving the original, unprocessed content of
|
||||
a web page.
|
||||
|
||||
Args:
|
||||
url: The URL to fetch raw content from.
|
||||
|
||||
Returns:
|
||||
A WebRawResponse object containing the raw HTML content, status code,
|
||||
and headers.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the request fails or an internal error occurs.
|
||||
"""
|
||||
try:
|
||||
config = get_app_config()
|
||||
headers = {"User-Agent": config.web_user_agent}
|
||||
response = requests.get(url, headers=headers, timeout=30)
|
||||
response.raise_for_status()
|
||||
|
||||
headers = dict(response.headers)
|
||||
|
||||
return WebRawResponse(
|
||||
url=url,
|
||||
status_code=response.status_code,
|
||||
headers=headers,
|
||||
content=response.text,
|
||||
)
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise HTTPException(
|
||||
status_code=503,
|
||||
detail=f"Unable to fetch content from URL: {e!s}",
|
||||
) from e
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"Error fetching raw content: {e!s}") from e
|
|
@ -9,17 +9,12 @@ web page.
|
|||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Annotated
|
||||
from fastapi import APIRouter
|
||||
|
||||
import requests
|
||||
import trafilatura
|
||||
from fastapi import APIRouter, HTTPException, Query
|
||||
from pydantic import HttpUrl # noqa: TC002
|
||||
|
||||
from openapi_mcp_server.core.config import get_app_config
|
||||
from openapi_mcp_server.tools.base import BaseTool
|
||||
|
||||
from .models import WebRawResponse, WebRequest, WebResponse
|
||||
from .models import WebRawRequest, WebRawResponse, WebRequest, WebResponse
|
||||
from .responses import fetch_raw_content, parse_web_page
|
||||
|
||||
|
||||
class WebTool(BaseTool):
|
||||
|
@ -51,23 +46,24 @@ class WebTool(BaseTool):
|
|||
router = APIRouter()
|
||||
|
||||
@router.post(
|
||||
"/web_read",
|
||||
"/read",
|
||||
response_model=WebResponse,
|
||||
summary="Extract and parse webpage content into clean markdown",
|
||||
summary=(
|
||||
"Extract clean webpage content as markdown, removing ads and navigation. Use for "
|
||||
"reading articles, documentation, or any webpage content that needs analysis or "
|
||||
"summarisation"
|
||||
),
|
||||
)
|
||||
def read(request: WebRequest) -> WebResponse:
|
||||
def post_read(request: WebRequest) -> WebResponse:
|
||||
"""Extracts and parses the content of a web page into clean Markdown.
|
||||
|
||||
This endpoint takes a URL and a set of options to control the parsing
|
||||
process. It returns the extracted content in a structured format.
|
||||
|
||||
Args:
|
||||
request: A WebRequest object containing the URL and parsing options.
|
||||
|
||||
Returns:
|
||||
A WebResponse object containing the parsed web content.
|
||||
"""
|
||||
return WebTool._parse_web_page(
|
||||
return parse_web_page(
|
||||
str(request.url),
|
||||
request.with_metadata or True,
|
||||
request.include_formatting or True,
|
||||
|
@ -76,126 +72,29 @@ class WebTool(BaseTool):
|
|||
request.include_tables or True,
|
||||
)
|
||||
|
||||
@router.get(
|
||||
"/web_raw",
|
||||
@router.post(
|
||||
"/raw",
|
||||
response_model=WebRawResponse,
|
||||
summary="Fetch raw HTML content and headers from any URL",
|
||||
summary=(
|
||||
"Fetch raw HTML source and HTTP headers. Use for technical analysis, accessing "
|
||||
"non-HTML content (text/JSON files), or when you need the complete unprocessed "
|
||||
"page structure, e.g. when the read tool's content seems incomplete"
|
||||
),
|
||||
)
|
||||
def raw(
|
||||
url: Annotated[HttpUrl, Query(description="URL to fetch raw content and headers from")],
|
||||
) -> WebRawResponse:
|
||||
def post_raw(request: WebRawRequest) -> WebRawResponse:
|
||||
"""Fetches the raw HTML content and headers from a given URL.
|
||||
|
||||
This endpoint is useful for retrieving the original, unprocessed content of
|
||||
a web page.
|
||||
|
||||
Args:
|
||||
url: The URL from which to fetch the raw content.
|
||||
|
||||
Returns:
|
||||
A WebRawResponse object containing the raw HTML content, status code,
|
||||
and headers.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the URL is missing, the request fails, or an
|
||||
internal error occurs.
|
||||
"""
|
||||
if not url:
|
||||
raise HTTPException(status_code=422, detail="URL query parameter is required.")
|
||||
|
||||
try:
|
||||
config = get_app_config()
|
||||
headers = {"User-Agent": config.web_user_agent}
|
||||
response = requests.get(str(url), headers=headers, timeout=30)
|
||||
response.raise_for_status()
|
||||
|
||||
headers = dict(response.headers)
|
||||
|
||||
return WebRawResponse(
|
||||
url=str(url),
|
||||
status_code=response.status_code,
|
||||
headers=headers,
|
||||
content=response.text,
|
||||
)
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise HTTPException(
|
||||
status_code=503,
|
||||
detail=f"Unable to fetch content from URL: {e!s}",
|
||||
) from e
|
||||
except Exception as e:
|
||||
raise HTTPException(
|
||||
status_code=500, detail=f"Error fetching raw content: {e!s}"
|
||||
) from e
|
||||
return fetch_raw_content(str(request.url))
|
||||
|
||||
return router
|
||||
|
||||
@staticmethod
|
||||
def _parse_web_page(
|
||||
url: str,
|
||||
with_metadata: bool = True,
|
||||
include_formatting: bool = True,
|
||||
include_images: bool = True,
|
||||
include_links: bool = True,
|
||||
include_tables: bool = True,
|
||||
) -> WebResponse:
|
||||
"""Performs the actual web page parsing using the trafilatura library.
|
||||
|
||||
This static method handles the fetching and extraction of content from a
|
||||
given URL based on the provided options.
|
||||
|
||||
Args:
|
||||
url: The URL of the web page to parse.
|
||||
with_metadata: Whether to include metadata in the extraction.
|
||||
include_formatting: Whether to keep formatting in the output.
|
||||
include_images: Whether to include images in the output.
|
||||
include_links: Whether to include links in the output.
|
||||
include_tables: Whether to include tables in the output.
|
||||
|
||||
Returns:
|
||||
A WebResponse object containing the parsed web content.
|
||||
|
||||
Raises:
|
||||
HTTPException: If the URL cannot be fetched, content cannot be
|
||||
extracted, or another error occurs during parsing.
|
||||
"""
|
||||
try:
|
||||
config = get_app_config()
|
||||
headers = {"User-Agent": config.web_user_agent}
|
||||
response = requests.get(url, headers=headers, timeout=30)
|
||||
response.raise_for_status()
|
||||
downloaded = response.text
|
||||
if not downloaded:
|
||||
raise HTTPException(
|
||||
status_code=404, detail=f"Unable to fetch content from URL: {url}"
|
||||
)
|
||||
|
||||
extracted_content = trafilatura.extract(
|
||||
downloaded,
|
||||
output_format="markdown",
|
||||
favor_recall=True,
|
||||
with_metadata=with_metadata,
|
||||
include_formatting=include_formatting,
|
||||
include_images=include_images,
|
||||
include_links=include_links,
|
||||
include_tables=include_tables,
|
||||
)
|
||||
|
||||
if not extracted_content:
|
||||
raise HTTPException(
|
||||
status_code=422, detail=f"Unable to extract content from URL: {url}"
|
||||
)
|
||||
|
||||
return WebResponse(
|
||||
url=url,
|
||||
content=extracted_content,
|
||||
)
|
||||
|
||||
except HTTPException:
|
||||
raise
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"Error parsing web page: {e!s}") from e
|
||||
|
||||
|
||||
# Create the tool instance that will be discovered by the registry
|
||||
tool = WebTool()
|
||||
|
|
|
@ -37,13 +37,18 @@ Homepage = "https://git.tomfos.tr/tom/openapi-mcp-server"
|
|||
|
||||
[dependency-groups]
|
||||
dev = [
|
||||
"ruff>=0",
|
||||
"httpx>=0",
|
||||
"mypy>=1",
|
||||
"pytest>=8",
|
||||
"pytest-asyncio>=0",
|
||||
"pytest-cov>=0",
|
||||
"pytest-mock>=0",
|
||||
"httpx>=0",
|
||||
"respx>=0",
|
||||
"ruff>=0",
|
||||
"types-requests>=2",
|
||||
"types-python-dateutil>=2",
|
||||
"types-pytz>=2025",
|
||||
"types-pyyaml>=6",
|
||||
]
|
||||
|
||||
[tool.uv]
|
||||
|
|
|
@ -1 +1,3 @@
|
|||
"""Test package for the OpenAPI MCP Server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
|
|
@ -1,3 +1,5 @@
|
|||
"""Test configuration and fixtures for the OpenAPI MCP Server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import contextlib
|
||||
|
@ -11,6 +13,7 @@ import pytest
|
|||
import yaml
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
from openapi_mcp_server.core.config import AppConfig, get_app_config
|
||||
from openapi_mcp_server.server import create_app
|
||||
|
||||
if TYPE_CHECKING:
|
||||
|
@ -20,7 +23,7 @@ if TYPE_CHECKING:
|
|||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def test_config(temp_memory_file):
|
||||
def test_config(temp_memory_file: str) -> Generator[None]:
|
||||
"""Override configuration values for testing."""
|
||||
original_env = os.environ.copy()
|
||||
|
||||
|
@ -54,15 +57,12 @@ def test_config(temp_memory_file):
|
|||
original_config_path = Path("config.yaml")
|
||||
config_backup = None
|
||||
if original_config_path.exists():
|
||||
config_backup = original_config_path.read_text()
|
||||
config_backup = original_config_path.read_text(encoding="utf-8")
|
||||
original_config_path.unlink()
|
||||
|
||||
test_config_path.rename("config.yaml")
|
||||
|
||||
# Force re-initialization of AppConfig after config.yaml is in place
|
||||
from openapi_mcp_server.core.config import AppConfig, get_app_config
|
||||
|
||||
AppConfig._instance = None
|
||||
AppConfig._instance = None # type: ignore[misc]
|
||||
get_app_config()
|
||||
|
||||
yield
|
||||
|
@ -75,29 +75,41 @@ def test_config(temp_memory_file):
|
|||
if Path("config.yaml").exists():
|
||||
Path("config.yaml").unlink()
|
||||
if config_backup:
|
||||
original_config_path.write_text(config_backup)
|
||||
original_config_path.write_text(config_backup, encoding="utf-8")
|
||||
|
||||
# Clean up test memory file
|
||||
test_memory_path = Path("/tmp/test_memory.json")
|
||||
test_memory_path = Path(tempfile.gettempdir()) / "test_memory.json"
|
||||
if test_memory_path.exists():
|
||||
test_memory_path.unlink()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def app() -> FastAPI:
|
||||
"""Create FastAPI app instance for testing."""
|
||||
"""Create FastAPI app instance for testing.
|
||||
|
||||
Returns:
|
||||
FastAPI application instance.
|
||||
"""
|
||||
return create_app()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def client(app: FastAPI) -> TestClient:
|
||||
"""Create test client for API calls."""
|
||||
"""Create test client for API calls.
|
||||
|
||||
Returns:
|
||||
TestClient instance for making API calls.
|
||||
"""
|
||||
return TestClient(app)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_httpx_client():
|
||||
"""Mock httpx.AsyncClient for external API calls."""
|
||||
def mock_httpx_client() -> Mock:
|
||||
"""Mock httpx.AsyncClient for external API calls.
|
||||
|
||||
Returns:
|
||||
Mock object configured for httpx requests.
|
||||
"""
|
||||
mock_client = Mock()
|
||||
mock_response = Mock()
|
||||
mock_response.status_code = 200
|
||||
|
@ -112,8 +124,12 @@ def mock_httpx_client():
|
|||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_requests():
|
||||
"""Mock requests module for external API calls."""
|
||||
def mock_requests() -> Mock:
|
||||
"""Mock requests module for external API calls.
|
||||
|
||||
Returns:
|
||||
Mock object configured for requests.
|
||||
"""
|
||||
mock_response = Mock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.json.return_value = {"message": "mocked response"}
|
||||
|
@ -126,7 +142,11 @@ def mock_requests():
|
|||
|
||||
@pytest.fixture
|
||||
def temp_memory_file() -> Generator[str]:
|
||||
"""Create temporary file for memory storage testing."""
|
||||
"""Create temporary file for memory storage testing.
|
||||
|
||||
Yields:
|
||||
Path to temporary JSON file.
|
||||
"""
|
||||
with tempfile.NamedTemporaryFile(encoding="utf-8", mode="w", suffix=".json", delete=False) as f:
|
||||
temp_file = f.name
|
||||
|
||||
|
@ -134,12 +154,16 @@ def temp_memory_file() -> Generator[str]:
|
|||
|
||||
# Clean up
|
||||
with contextlib.suppress(FileNotFoundError):
|
||||
os.unlink(temp_file)
|
||||
Path(temp_file).unlink()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def sample_memory_data() -> list[dict[str, Any]]:
|
||||
"""Sample memory data for testing."""
|
||||
"""Sample memory data for testing.
|
||||
|
||||
Returns:
|
||||
List of sample memory entries.
|
||||
"""
|
||||
return [
|
||||
{
|
||||
"id": "1",
|
||||
|
@ -164,7 +188,11 @@ def sample_memory_data() -> list[dict[str, Any]]:
|
|||
|
||||
@pytest.fixture
|
||||
def sample_weather_data() -> dict[str, Any]:
|
||||
"""Sample weather API response for testing."""
|
||||
"""Sample weather API response for testing.
|
||||
|
||||
Returns:
|
||||
Dictionary with weather API response data.
|
||||
"""
|
||||
return {
|
||||
"latitude": 52.52,
|
||||
"longitude": 13.41,
|
||||
|
@ -201,7 +229,11 @@ def sample_weather_data() -> dict[str, Any]:
|
|||
|
||||
@pytest.fixture
|
||||
def sample_searxng_data() -> dict[str, Any]:
|
||||
"""Sample SearXNG API response for testing."""
|
||||
"""Sample SearXNG API response for testing.
|
||||
|
||||
Returns:
|
||||
Dictionary with SearXNG API response data.
|
||||
"""
|
||||
return {
|
||||
"query": "test query",
|
||||
"number_of_results": 2,
|
||||
|
@ -230,7 +262,11 @@ def sample_searxng_data() -> dict[str, Any]:
|
|||
|
||||
@pytest.fixture
|
||||
def sample_forgejo_data() -> dict[str, Any]:
|
||||
"""Sample Forgejo API response data for testing."""
|
||||
"""Sample Forgejo API response data for testing.
|
||||
|
||||
Returns:
|
||||
Dictionary with Forgejo API response data.
|
||||
"""
|
||||
return {
|
||||
"version": "1.21.0",
|
||||
"repositories": [
|
||||
|
|
|
@ -1 +1,3 @@
|
|||
"""End-to-end tests package for the OpenAPI MCP Server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
|
2
tests/fixtures/__init__.py
vendored
2
tests/fixtures/__init__.py
vendored
|
@ -1 +1,3 @@
|
|||
"""Test fixtures package for the OpenAPI MCP Server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
|
15
tests/fixtures/mock_responses.py
vendored
15
tests/fixtures/mock_responses.py
vendored
|
@ -1,3 +1,9 @@
|
|||
"""Mock response data for testing API endpoints.
|
||||
|
||||
This module provides mock response data for various external APIs used by the
|
||||
OpenAPI MCP server tools, including weather, search, and repository services.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any
|
||||
|
@ -8,6 +14,7 @@ class MockWeatherResponse:
|
|||
|
||||
@staticmethod
|
||||
def success_response() -> dict[str, Any]:
|
||||
"""Return a successful weather API response with sample data."""
|
||||
return {
|
||||
"latitude": 52.52,
|
||||
"longitude": 13.41,
|
||||
|
@ -47,6 +54,7 @@ class MockSearXNGResponse:
|
|||
|
||||
@staticmethod
|
||||
def search_response() -> dict[str, Any]:
|
||||
"""Return a successful search API response with sample results."""
|
||||
return {
|
||||
"query": "test query",
|
||||
"number_of_results": 2,
|
||||
|
@ -74,6 +82,7 @@ class MockSearXNGResponse:
|
|||
|
||||
@staticmethod
|
||||
def categories_response() -> dict[str, Any]:
|
||||
"""Return available search categories response."""
|
||||
return {
|
||||
"categories": [
|
||||
"general",
|
||||
|
@ -88,6 +97,7 @@ class MockSearXNGResponse:
|
|||
|
||||
@staticmethod
|
||||
def engines_response() -> dict[str, Any]:
|
||||
"""Return available search engines response."""
|
||||
return {
|
||||
"engines": [
|
||||
{"name": "google", "categories": ["general"]},
|
||||
|
@ -102,10 +112,12 @@ class MockForgejoResponse:
|
|||
|
||||
@staticmethod
|
||||
def version_response() -> dict[str, Any]:
|
||||
"""Return Forgejo version information."""
|
||||
return {"version": "1.21.0"}
|
||||
|
||||
@staticmethod
|
||||
def repositories_response() -> list[dict[str, Any]]:
|
||||
"""Return list of repositories from Forgejo API."""
|
||||
return [
|
||||
{
|
||||
"id": 1,
|
||||
|
@ -132,6 +144,7 @@ class MockForgejoResponse:
|
|||
|
||||
@staticmethod
|
||||
def branches_response() -> list[dict[str, Any]]:
|
||||
"""Return list of branches from Forgejo API."""
|
||||
return [
|
||||
{
|
||||
"name": "main",
|
||||
|
@ -161,10 +174,12 @@ class MockWebResponse:
|
|||
|
||||
@staticmethod
|
||||
def trafilatura_response() -> str:
|
||||
"""Return extracted content from a webpage."""
|
||||
return "# Test Article\n\nThis is a test article extracted from a webpage."
|
||||
|
||||
@staticmethod
|
||||
def raw_html_response() -> str:
|
||||
"""Return raw HTML content for testing."""
|
||||
return """
|
||||
<html>
|
||||
<head><title>Test Page</title></head>
|
||||
|
|
58
tests/fixtures/test_data.py
vendored
58
tests/fixtures/test_data.py
vendored
|
@ -1,10 +1,18 @@
|
|||
"""Test data constants for API endpoint testing.
|
||||
|
||||
This module provides structured test data for various tools including
|
||||
time zones, timestamps, memory entries, and other testing fixtures.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any, ClassVar
|
||||
|
||||
|
||||
class TimeTestData:
|
||||
"""Test data for time tool tests."""
|
||||
|
||||
VALID_TIMEZONES = [
|
||||
VALID_TIMEZONES: ClassVar[list[str]] = [
|
||||
"UTC",
|
||||
"Europe/London",
|
||||
"America/New_York",
|
||||
|
@ -12,7 +20,7 @@ class TimeTestData:
|
|||
"Australia/Sydney",
|
||||
]
|
||||
|
||||
INVALID_TIMEZONES = [
|
||||
INVALID_TIMEZONES: ClassVar[list[str]] = [
|
||||
"Invalid/Timezone",
|
||||
"UTC+5",
|
||||
"",
|
||||
|
@ -20,19 +28,19 @@ class TimeTestData:
|
|||
"NotATimezone",
|
||||
]
|
||||
|
||||
UNIX_TIMESTAMPS = [
|
||||
UNIX_TIMESTAMPS: ClassVar[list[int]] = [
|
||||
1704067200, # 2024-01-01 00:00:00 UTC
|
||||
1704153600, # 2024-01-02 00:00:00 UTC
|
||||
1704240000, # 2024-01-03 00:00:00 UTC
|
||||
]
|
||||
|
||||
ISO_TIMESTAMPS = [
|
||||
ISO_TIMESTAMPS: ClassVar[list[str]] = [
|
||||
"2024-01-01T00:00:00Z",
|
||||
"2024-01-02T00:00:00Z",
|
||||
"2024-01-03T00:00:00Z",
|
||||
]
|
||||
|
||||
HUMAN_TIMESTAMPS = [
|
||||
HUMAN_TIMESTAMPS: ClassVar[list[str]] = [
|
||||
"2024-01-01 12:00:00",
|
||||
"January 1, 2024 12:00 PM",
|
||||
"2024-01-01T12:00:00",
|
||||
|
@ -43,7 +51,7 @@ class TimeTestData:
|
|||
class MemoryTestData:
|
||||
"""Test data for memory tool tests."""
|
||||
|
||||
SAMPLE_MEMORIES = [
|
||||
SAMPLE_MEMORIES: ClassVar[list[dict[str, Any]]] = [
|
||||
{
|
||||
"content": "John likes pizza",
|
||||
"entities": ["John"],
|
||||
|
@ -62,7 +70,7 @@ class MemoryTestData:
|
|||
},
|
||||
]
|
||||
|
||||
SEARCH_QUERIES = [
|
||||
SEARCH_QUERIES: ClassVar[list[dict[str, Any]]] = [
|
||||
{"keywords": ["pizza"], "expected_count": 1},
|
||||
{"keywords": ["Google"], "expected_count": 1},
|
||||
{"keywords": ["Python"], "expected_count": 1},
|
||||
|
@ -71,7 +79,7 @@ class MemoryTestData:
|
|||
{"keywords": ["nonexistent"], "expected_count": 0},
|
||||
]
|
||||
|
||||
ENTITY_QUERIES = [
|
||||
ENTITY_QUERIES: ClassVar[list[dict[str, Any]]] = [
|
||||
{"entities": ["John"], "expected_count": 1},
|
||||
{"entities": ["Alice"], "expected_count": 1},
|
||||
{"entities": ["Bob"], "expected_count": 1},
|
||||
|
@ -83,20 +91,20 @@ class MemoryTestData:
|
|||
class WeatherTestData:
|
||||
"""Test data for weather tool tests."""
|
||||
|
||||
VALID_COORDINATES = [
|
||||
VALID_COORDINATES: ClassVar[list[dict[str, float]]] = [
|
||||
{"latitude": 52.52, "longitude": 13.41}, # Berlin
|
||||
{"latitude": 51.5074, "longitude": -0.1278}, # London
|
||||
{"latitude": 40.7128, "longitude": -74.0060}, # New York
|
||||
]
|
||||
|
||||
INVALID_COORDINATES = [
|
||||
INVALID_COORDINATES: ClassVar[list[dict[str, float]]] = [
|
||||
{"latitude": 91.0, "longitude": 0.0}, # Invalid latitude
|
||||
{"latitude": 0.0, "longitude": 181.0}, # Invalid longitude
|
||||
{"latitude": -91.0, "longitude": 0.0}, # Invalid latitude
|
||||
{"latitude": 0.0, "longitude": -181.0}, # Invalid longitude
|
||||
]
|
||||
|
||||
GEOCODING_LOCATIONS = [
|
||||
GEOCODING_LOCATIONS: ClassVar[list[dict[str, Any]]] = [
|
||||
{"location": "London, UK", "expected_lat": 51.5074, "expected_lon": -0.1278},
|
||||
{"location": "New York, NY", "expected_lat": 40.7128, "expected_lon": -74.0060},
|
||||
{"location": "Berlin, Germany", "expected_lat": 52.52, "expected_lon": 13.41},
|
||||
|
@ -106,13 +114,13 @@ class WeatherTestData:
|
|||
class WebTestData:
|
||||
"""Test data for web tool tests."""
|
||||
|
||||
VALID_URLS = [
|
||||
VALID_URLS: ClassVar[list[str]] = [
|
||||
"https://example.com",
|
||||
"https://httpbin.org/html",
|
||||
"https://www.github.com",
|
||||
]
|
||||
|
||||
INVALID_URLS = [
|
||||
INVALID_URLS: ClassVar[list[str]] = [
|
||||
"not-a-url",
|
||||
"ftp://example.com",
|
||||
"https://",
|
||||
|
@ -120,7 +128,7 @@ class WebTestData:
|
|||
"https://nonexistent-domain-12345.com",
|
||||
]
|
||||
|
||||
EXPECTED_CONTENT_TYPES = [
|
||||
EXPECTED_CONTENT_TYPES: ClassVar[list[str]] = [
|
||||
"text/html",
|
||||
"application/json",
|
||||
"text/plain",
|
||||
|
@ -130,18 +138,18 @@ class WebTestData:
|
|||
class SearXNGTestData:
|
||||
"""Test data for SearXNG tool tests."""
|
||||
|
||||
SEARCH_QUERIES = [
|
||||
SEARCH_QUERIES: ClassVar[list[dict[str, str]]] = [
|
||||
{"query": "python programming", "category": "general"},
|
||||
{"query": "machine learning", "category": "it"},
|
||||
{"query": "climate change", "category": "science"},
|
||||
]
|
||||
|
||||
INVALID_QUERIES = [
|
||||
INVALID_QUERIES: ClassVar[list[dict[str, str]]] = [
|
||||
{"query": "", "category": "general"},
|
||||
{"query": "test", "category": "invalid_category"},
|
||||
]
|
||||
|
||||
CATEGORIES = [
|
||||
CATEGORIES: ClassVar[list[str]] = [
|
||||
"general",
|
||||
"images",
|
||||
"news",
|
||||
|
@ -151,7 +159,7 @@ class SearXNGTestData:
|
|||
"science",
|
||||
]
|
||||
|
||||
ENGINES = [
|
||||
ENGINES: ClassVar[list[str]] = [
|
||||
"google",
|
||||
"bing",
|
||||
"duckduckgo",
|
||||
|
@ -163,44 +171,44 @@ class SearXNGTestData:
|
|||
class ForgejoTestData:
|
||||
"""Test data for Forgejo tool tests."""
|
||||
|
||||
REPOSITORIES = [
|
||||
REPOSITORIES: ClassVar[list[dict[str, str]]] = [
|
||||
{"owner": "testuser", "repo": "test-repo"},
|
||||
{"owner": "orgname", "repo": "project-repo"},
|
||||
]
|
||||
|
||||
BRANCHES = [
|
||||
BRANCHES: ClassVar[list[str]] = [
|
||||
"main",
|
||||
"develop",
|
||||
"feature/new-feature",
|
||||
"hotfix/urgent-fix",
|
||||
]
|
||||
|
||||
FILE_PATHS = [
|
||||
FILE_PATHS: ClassVar[list[str]] = [
|
||||
"README.md",
|
||||
"src/main.py",
|
||||
"docs/api.md",
|
||||
"tests/test_main.py",
|
||||
]
|
||||
|
||||
COMMIT_SHAS = [
|
||||
COMMIT_SHAS: ClassVar[list[str]] = [
|
||||
"abc123def456",
|
||||
"def456ghi789",
|
||||
"ghi789jkl012",
|
||||
]
|
||||
|
||||
WORKFLOW_RUNS = [
|
||||
WORKFLOW_RUNS: ClassVar[list[dict[str, Any]]] = [
|
||||
{"id": 1, "status": "completed", "conclusion": "success"},
|
||||
{"id": 2, "status": "in_progress", "conclusion": None},
|
||||
{"id": 3, "status": "completed", "conclusion": "failure"},
|
||||
]
|
||||
|
||||
ISSUES = [
|
||||
ISSUES: ClassVar[list[dict[str, Any]]] = [
|
||||
{"index": 1, "title": "Bug in authentication", "state": "open"},
|
||||
{"index": 2, "title": "Feature request", "state": "closed"},
|
||||
{"index": 3, "title": "Documentation update", "state": "open"},
|
||||
]
|
||||
|
||||
PULL_REQUESTS = [
|
||||
PULL_REQUESTS: ClassVar[list[dict[str, Any]]] = [
|
||||
{"index": 1, "title": "Fix authentication bug", "state": "open"},
|
||||
{"index": 2, "title": "Add new feature", "state": "merged"},
|
||||
{"index": 3, "title": "Update documentation", "state": "closed"},
|
||||
|
|
|
@ -1 +1,3 @@
|
|||
"""Integration tests package for the OpenAPI MCP Server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
|
File diff suppressed because it is too large
Load diff
|
@ -14,7 +14,9 @@ from openapi_mcp_server.server import app
|
|||
from openapi_mcp_server.tools.memory.models import (
|
||||
CreateMemoryRequest,
|
||||
DeleteMemoryRequest,
|
||||
GetAllMemoriesRequest,
|
||||
GetEntityRequest,
|
||||
GetMemorySummaryRequest,
|
||||
SearchMemoryRequest,
|
||||
)
|
||||
|
||||
|
@ -36,8 +38,7 @@ def test_create_memory() -> None:
|
|||
assert "timestamp" in response_json
|
||||
assert response_json["content"] == "This is a test memory."
|
||||
assert len(response_json["entities"]) == 1
|
||||
assert response_json["entities"][0]["name"] == "test_entity"
|
||||
assert response_json["entities"][0]["entity_type"] == "generic"
|
||||
assert response_json["entities"][0] == "test_entity"
|
||||
|
||||
|
||||
def test_get_all_memories() -> None:
|
||||
|
@ -47,7 +48,8 @@ def test_get_all_memories() -> None:
|
|||
It checks for a 200 OK status code and that the response contains a list of memories
|
||||
and entities within a MemoryGraph structure.
|
||||
"""
|
||||
response = client.get("/memory/all")
|
||||
request_data = GetAllMemoriesRequest(limit=20)
|
||||
response = client.post("/memory/all", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert "memories" in response.json()
|
||||
assert "entities" in response.json()
|
||||
|
@ -108,7 +110,8 @@ def test_get_memory_summary() -> None:
|
|||
It checks for a 200 OK status code and that the response contains expected summary fields
|
||||
like total_memories, total_entities, etc.
|
||||
"""
|
||||
response = client.get("/memory/stats")
|
||||
request_data = GetMemorySummaryRequest()
|
||||
response = client.post("/memory/stats", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert "total_memories" in response.json()
|
||||
assert "total_entities" in response.json()
|
||||
|
|
|
@ -8,18 +8,18 @@ error conditions. Mocking is used to prevent actual external network calls to th
|
|||
|
||||
from __future__ import annotations
|
||||
|
||||
from unittest.mock import patch
|
||||
from unittest.mock import MagicMock, patch
|
||||
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
from openapi_mcp_server.server import app
|
||||
from openapi_mcp_server.tools.searxng.models import SearchRequest
|
||||
from openapi_mcp_server.tools.searxng.models import CategoriesRequest, EnginesRequest, SearchRequest
|
||||
|
||||
client = TestClient(app)
|
||||
|
||||
|
||||
@patch("requests.get")
|
||||
def test_search_web(mock_get) -> None:
|
||||
def test_search_web(mock_get: MagicMock) -> None:
|
||||
"""Test the /searxng/search endpoint.
|
||||
|
||||
This test verifies that the endpoint correctly performs a web search and returns results.
|
||||
|
@ -28,6 +28,7 @@ def test_search_web(mock_get) -> None:
|
|||
presence of key fields in the search response.
|
||||
"""
|
||||
mock_get.return_value.status_code = 200
|
||||
mock_get.return_value.raise_for_status.return_value = None
|
||||
mock_get.return_value.json.return_value = {
|
||||
"query": "test query",
|
||||
"number_of_results": 1,
|
||||
|
@ -45,7 +46,9 @@ def test_search_web(mock_get) -> None:
|
|||
"engines": [],
|
||||
}
|
||||
|
||||
request_data = SearchRequest(query="test query")
|
||||
request_data = SearchRequest(
|
||||
query="test query", categories=None, engines=None, language="en", format="json", pageno=1
|
||||
)
|
||||
response = client.post("/searxng/search", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 200
|
||||
|
@ -55,7 +58,7 @@ def test_search_web(mock_get) -> None:
|
|||
|
||||
|
||||
@patch("requests.get")
|
||||
def test_get_categories(mock_get) -> None:
|
||||
def test_get_categories(mock_get: MagicMock) -> None:
|
||||
"""Test the /searxng/categories endpoint.
|
||||
|
||||
This test verifies that the endpoint correctly retrieves available search categories.
|
||||
|
@ -63,9 +66,11 @@ def test_get_categories(mock_get) -> None:
|
|||
a 200 OK status code and the presence of categories in the response.
|
||||
"""
|
||||
mock_get.return_value.status_code = 200
|
||||
mock_get.return_value.raise_for_status.return_value = None
|
||||
mock_get.return_value.json.return_value = {"categories": ["general", "images"]}
|
||||
|
||||
response = client.get("/searxng/categories")
|
||||
request_data = CategoriesRequest()
|
||||
response = client.post("/searxng/categories", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 200
|
||||
assert "categories" in response.json()
|
||||
|
@ -74,7 +79,7 @@ def test_get_categories(mock_get) -> None:
|
|||
|
||||
|
||||
@patch("requests.get")
|
||||
def test_get_engines(mock_get) -> None:
|
||||
def test_get_engines(mock_get: MagicMock) -> None:
|
||||
"""Test the /searxng/engines endpoint.
|
||||
|
||||
This test verifies that the endpoint correctly retrieves available search engines.
|
||||
|
@ -82,9 +87,11 @@ def test_get_engines(mock_get) -> None:
|
|||
a 200 OK status code and the presence of engines in the response.
|
||||
"""
|
||||
mock_get.return_value.status_code = 200
|
||||
mock_get.return_value.raise_for_status.return_value = None
|
||||
mock_get.return_value.json.return_value = {"engines": [{"name": "Google"}, {"name": "Bing"}]}
|
||||
|
||||
response = client.get("/searxng/engines")
|
||||
request_data = EnginesRequest()
|
||||
response = client.post("/searxng/engines", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 200
|
||||
assert "engines" in response.json()
|
||||
|
|
|
@ -15,6 +15,8 @@ from openapi_mcp_server.server import app
|
|||
from openapi_mcp_server.tools.time.models import (
|
||||
ConvertTimeInput,
|
||||
ElapsedTimeInput,
|
||||
GetTimeInput,
|
||||
ListTimeZonesRequest,
|
||||
ParseTimestampInput,
|
||||
UnixToIsoInput,
|
||||
)
|
||||
|
@ -26,9 +28,11 @@ def test_get_current_time() -> None:
|
|||
"""Test the /time/get_time endpoint.
|
||||
|
||||
This test verifies that the endpoint returns the current time in the specified timezone.
|
||||
It checks for a 200 OK status code and that the response contains non-empty 'time' and 'tz' fields.
|
||||
It checks for a 200 OK status code and that the response contains non-empty 'time'
|
||||
and 'tz' fields.
|
||||
"""
|
||||
response = client.post("/time/get_time", json={"timezone": "UTC"})
|
||||
request_data = GetTimeInput(timezone="UTC")
|
||||
response = client.post("/time/current", json=request_data.model_dump())
|
||||
if response.status_code != 200:
|
||||
pytest.fail(f"Expected status code 200, got {response.status_code}")
|
||||
|
||||
|
@ -56,9 +60,9 @@ def test_unix_to_iso() -> None:
|
|||
response = client.post("/time/unix_to_iso", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert "iso_time" in response.json()
|
||||
assert response.json()["iso_time"].startswith("2023-03-15T12:00:00") or response.json()[
|
||||
"iso_time"
|
||||
].startswith("2023-03-15T12:00:00+00:00")
|
||||
# Check that it's a valid ISO timestamp with the expected date
|
||||
iso_time = response.json()["iso_time"]
|
||||
assert "2023-03-15" in iso_time
|
||||
|
||||
|
||||
def test_convert_time() -> None:
|
||||
|
@ -71,7 +75,7 @@ def test_convert_time() -> None:
|
|||
request_data = ConvertTimeInput(
|
||||
timestamp="2023-03-15T12:00:00", from_tz="America/New_York", to_tz="Europe/London"
|
||||
)
|
||||
response = client.post("/time/convert_time", json=request_data.model_dump())
|
||||
response = client.post("/time/convert", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert "converted_time" in response.json()
|
||||
assert response.json()["converted_time"].startswith(
|
||||
|
@ -82,14 +86,15 @@ def test_convert_time() -> None:
|
|||
def test_elapsed_time() -> None:
|
||||
"""Test the /time/elapsed_time endpoint.
|
||||
|
||||
This test verifies that the endpoint correctly calculates the elapsed time between two timestamps.
|
||||
This test verifies that the endpoint correctly calculates the elapsed time between
|
||||
two timestamps.
|
||||
It sends a POST request with start and end timestamps and the desired unit,
|
||||
then checks for a 200 OK status code and the correct elapsed time.
|
||||
"""
|
||||
request_data = ElapsedTimeInput(
|
||||
start="2023-03-15T12:00:00Z", end="2023-03-15T13:00:00Z", units="hours"
|
||||
)
|
||||
response = client.post("/time/elapsed_time", json=request_data.model_dump())
|
||||
response = client.post("/time/elapsed", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert "elapsed" in response.json()
|
||||
assert response.json()["elapsed"] == 1.0
|
||||
|
@ -100,11 +105,11 @@ def test_parse_timestamp() -> None:
|
|||
"""Test the /time/parse_timestamp endpoint.
|
||||
|
||||
This test verifies that the endpoint can parse a human-readable timestamp string
|
||||
into a standardized UTC ISO format. It sends a POST request with a timestamp string
|
||||
into a standardised UTC ISO format. It sends a POST request with a timestamp string
|
||||
and checks for a 200 OK status code and a valid UTC ISO formatted time.
|
||||
"""
|
||||
request_data = ParseTimestampInput(timestamp="June 1st 2024 3:30 PM", timezone="UTC")
|
||||
response = client.post("/time/parse_timestamp", json=request_data.model_dump())
|
||||
response = client.post("/time/parse", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert "utc" in response.json()
|
||||
assert response.json()["utc"].startswith("2024-06-01T15:30:00")
|
||||
|
@ -116,7 +121,8 @@ def test_list_time_zones() -> None:
|
|||
This test verifies that the endpoint returns a list of all valid IANA timezone names.
|
||||
It checks for a 200 OK status code and that the response is a non-empty list of strings.
|
||||
"""
|
||||
response = client.get("/time/list_time_zones")
|
||||
request_data = ListTimeZonesRequest()
|
||||
response = client.post("/time/list_zones", json=request_data.model_dump())
|
||||
assert response.status_code == 200
|
||||
assert isinstance(response.json(), list)
|
||||
assert len(response.json()) > 0
|
||||
|
|
|
@ -16,16 +16,17 @@ from fastapi.testclient import TestClient
|
|||
|
||||
from openapi_mcp_server.core.config import get_app_config
|
||||
from openapi_mcp_server.server import app
|
||||
from openapi_mcp_server.tools.weather.models import WeatherForecastRequest
|
||||
|
||||
client = TestClient(app)
|
||||
|
||||
|
||||
@patch("requests.get")
|
||||
@patch("openapi_mcp_server.tools.weather.routes.WeatherTool.geocode_location")
|
||||
@patch("openapi_mcp_server.tools.weather.responses.geocode_location")
|
||||
@patch("reverse_geocoder.search")
|
||||
def test_get_weather_forecast_with_coords(
|
||||
mock_rg_search: MagicMock,
|
||||
mock_geocode_location: MagicMock,
|
||||
mock_geocode_location: MagicMock, # noqa: ARG001
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
"""Test the /weather/forecast endpoint with explicit coordinates.
|
||||
|
@ -61,10 +62,11 @@ def test_get_weather_forecast_with_coords(
|
|||
},
|
||||
}
|
||||
|
||||
# Mock reverse geocoder to return a country code (e.g., Germany for Celsius)
|
||||
# Mock reverse geocoder to return a country code (e.g. Germany for Celsius)
|
||||
mock_rg_search.return_value = [{"cc": "DE"}]
|
||||
|
||||
response = client.get("/weather/forecast?latitude=52.52&longitude=13.41")
|
||||
request_data = WeatherForecastRequest(latitude=52.52, longitude=13.41)
|
||||
response = client.post("/weather/forecast", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 200
|
||||
assert response.json()["latitude"] == 52.52
|
||||
|
@ -84,7 +86,7 @@ def test_get_weather_forecast_with_coords(
|
|||
|
||||
|
||||
@patch("requests.get")
|
||||
@patch("openapi_mcp_server.tools.weather.routes.WeatherTool.geocode_location")
|
||||
@patch("openapi_mcp_server.tools.weather.responses.geocode_location")
|
||||
@patch("reverse_geocoder.search")
|
||||
def test_get_weather_forecast_with_default_location(
|
||||
mock_rg_search: MagicMock,
|
||||
|
@ -131,10 +133,11 @@ def test_get_weather_forecast_with_default_location(
|
|||
},
|
||||
}
|
||||
|
||||
# Mock reverse geocoder to return a country code (e.g., UK for Celsius)
|
||||
# Mock reverse geocoder to return a country code (e.g. UK for Celsius)
|
||||
mock_rg_search.return_value = [{"cc": "GB"}]
|
||||
|
||||
response = client.get("/weather/forecast")
|
||||
request_data = WeatherForecastRequest(latitude=None, longitude=None)
|
||||
response = client.post("/weather/forecast", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 200
|
||||
assert response.json()["latitude"] == 51.5
|
||||
|
@ -158,10 +161,10 @@ def test_get_weather_forecast_with_default_location(
|
|||
|
||||
|
||||
@patch("requests.get")
|
||||
@patch("openapi_mcp_server.tools.weather.routes.WeatherTool.geocode_location")
|
||||
@patch("openapi_mcp_server.tools.weather.responses.geocode_location")
|
||||
@patch("reverse_geocoder.search")
|
||||
def test_get_weather_forecast_no_coords_no_default(
|
||||
mock_rg_search: MagicMock,
|
||||
mock_rg_search: MagicMock, # noqa: ARG001
|
||||
mock_geocode_location: MagicMock,
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
|
@ -172,10 +175,11 @@ def test_get_weather_forecast_no_coords_no_default(
|
|||
the API correctly handles this missing input scenario.
|
||||
"""
|
||||
# Temporarily clear default location in the app config for this test
|
||||
original_default_location = get_app_config().default_location
|
||||
original_default_location: str | None = get_app_config().default_location
|
||||
get_app_config().default_location = None
|
||||
|
||||
response = client.get("/weather/forecast")
|
||||
request_data = WeatherForecastRequest(latitude=None, longitude=None)
|
||||
response = client.post("/weather/forecast", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 422
|
||||
assert "Latitude and longitude are required" in response.json()["detail"]
|
||||
|
@ -187,10 +191,10 @@ def test_get_weather_forecast_no_coords_no_default(
|
|||
|
||||
|
||||
@patch("requests.get")
|
||||
@patch("openapi_mcp_server.tools.weather.routes.WeatherTool.geocode_location")
|
||||
@patch("openapi_mcp_server.tools.weather.responses.geocode_location")
|
||||
@patch("reverse_geocoder.search")
|
||||
def test_get_weather_forecast_geocode_failure(
|
||||
mock_rg_search: MagicMock,
|
||||
mock_rg_search: MagicMock, # noqa: ARG001
|
||||
mock_geocode_location: MagicMock,
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
|
@ -207,7 +211,8 @@ def test_get_weather_forecast_geocode_failure(
|
|||
# Mock geocoding to return None (failure)
|
||||
mock_geocode_location.return_value = None
|
||||
|
||||
response = client.get("/weather/forecast")
|
||||
request_data = WeatherForecastRequest(latitude=None, longitude=None)
|
||||
response = client.post("/weather/forecast", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 422
|
||||
assert "Could not geocode default location" in response.json()["detail"]
|
||||
|
@ -219,26 +224,27 @@ def test_get_weather_forecast_geocode_failure(
|
|||
|
||||
|
||||
@patch("requests.get")
|
||||
@patch("openapi_mcp_server.tools.weather.routes.WeatherTool.geocode_location")
|
||||
@patch("openapi_mcp_server.tools.weather.responses.geocode_location")
|
||||
@patch("reverse_geocoder.search")
|
||||
def test_get_weather_forecast_api_error(
|
||||
mock_rg_search: MagicMock,
|
||||
mock_geocode_location: MagicMock,
|
||||
mock_geocode_location: MagicMock, # noqa: ARG001
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
"""Test the /weather/forecast endpoint when the external weather API returns an error.
|
||||
|
||||
This test verifies that the endpoint correctly handles errors from the Open-Meteo API.
|
||||
It mocks the external API call to raise an exception (e.g., connection error) and checks
|
||||
It mocks the external API call to raise an exception (e.g. connection error) and checks
|
||||
for a 503 Service Unavailable status code.
|
||||
"""
|
||||
# Mock external API to raise an exception
|
||||
mock_requests_get.side_effect = requests.exceptions.RequestException("Connection error")
|
||||
|
||||
# Mock reverse geocoder to return a country code (e.g., Germany for Celsius)
|
||||
# Mock reverse geocoder to return a country code (e.g. Germany for Celsius)
|
||||
mock_rg_search.return_value = [{"cc": "DE"}]
|
||||
|
||||
response = client.get("/weather/forecast?latitude=52.52&longitude=13.41")
|
||||
request_data = WeatherForecastRequest(latitude=52.52, longitude=13.41)
|
||||
response = client.post("/weather/forecast", json=request_data.model_dump())
|
||||
|
||||
assert response.status_code == 503
|
||||
assert "Error connecting to Open-Meteo API" in response.json()["detail"]
|
||||
|
|
|
@ -10,19 +10,19 @@ from __future__ import annotations
|
|||
|
||||
from unittest.mock import MagicMock, patch
|
||||
|
||||
import requests.exceptions
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
from openapi_mcp_server.server import app
|
||||
from openapi_mcp_server.tools.web.models import WebRequest
|
||||
|
||||
client = TestClient(app)
|
||||
|
||||
|
||||
@patch("trafilatura.fetch_url")
|
||||
@patch("openapi_mcp_server.tools.web.responses.requests.get")
|
||||
@patch("trafilatura.extract")
|
||||
def test_web_read_success(
|
||||
mock_trafilatura_extract: MagicMock,
|
||||
mock_trafilatura_fetch_url: MagicMock,
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
"""Test the /web/web_read endpoint for successful content extraction.
|
||||
|
||||
|
@ -31,65 +31,93 @@ def test_web_read_success(
|
|||
a successful content fetch and extraction, and checks for a 200 OK status code
|
||||
and the expected content in the response.
|
||||
"""
|
||||
mock_trafilatura_fetch_url.return_value = (
|
||||
"<html><body><h1>Test</h1><p>Content</p></body></html>"
|
||||
)
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.text = "<html><body><h1>Test</h1><p>Content</p></body></html>"
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_requests_get.return_value = mock_response
|
||||
mock_trafilatura_extract.return_value = "# Test\n\nContent"
|
||||
|
||||
request_data = WebRequest(url="http://example.com")
|
||||
response = client.post("/web/web_read", json=request_data.model_dump())
|
||||
request_data = {"url": "http://example.com"}
|
||||
response = client.post("/web/read", json=request_data)
|
||||
|
||||
assert response.status_code == 200
|
||||
assert response.json()["url"] == "http://example.com"
|
||||
assert response.json()["url"] == "http://example.com/"
|
||||
assert response.json()["content"] == "# Test\n\nContent"
|
||||
mock_trafilatura_fetch_url.assert_called_once_with("http://example.com")
|
||||
mock_requests_get.assert_called_once_with(
|
||||
"http://example.com/",
|
||||
headers={
|
||||
"User-Agent": (
|
||||
"Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0esr"
|
||||
)
|
||||
},
|
||||
timeout=30,
|
||||
)
|
||||
mock_trafilatura_extract.assert_called_once()
|
||||
|
||||
|
||||
@patch("trafilatura.fetch_url")
|
||||
@patch("openapi_mcp_server.tools.web.responses.requests.get")
|
||||
@patch("trafilatura.extract")
|
||||
def test_web_read_fetch_failure(
|
||||
mock_trafilatura_extract: MagicMock,
|
||||
mock_trafilatura_fetch_url: MagicMock,
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
"""Test the /web/web_read endpoint when fetching the URL fails.
|
||||
|
||||
This test verifies that the endpoint returns a 404 Not Found error when `trafilatura.fetch_url`
|
||||
returns None, indicating a failure to fetch the web page. It ensures proper error handling.
|
||||
"""
|
||||
mock_trafilatura_fetch_url.return_value = None
|
||||
mock_requests_get.side_effect = requests.exceptions.RequestException("Connection error")
|
||||
|
||||
request_data = WebRequest(url="http://nonexistent.com")
|
||||
response = client.post("/web/web_read", json=request_data.model_dump())
|
||||
request_data = {"url": "http://nonexistent.com"}
|
||||
response = client.post("/web/read", json=request_data)
|
||||
|
||||
assert response.status_code == 404
|
||||
assert "Unable to fetch content from URL" in response.json()["detail"]
|
||||
mock_trafilatura_fetch_url.assert_called_once_with("http://nonexistent.com")
|
||||
assert response.status_code == 500
|
||||
assert "Error parsing web page" in response.json()["detail"]
|
||||
mock_requests_get.assert_called_once_with(
|
||||
"http://nonexistent.com/",
|
||||
headers={
|
||||
"User-Agent": (
|
||||
"Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0esr"
|
||||
)
|
||||
},
|
||||
timeout=30,
|
||||
)
|
||||
mock_trafilatura_extract.assert_not_called()
|
||||
|
||||
|
||||
@patch("trafilatura.fetch_url")
|
||||
@patch("openapi_mcp_server.tools.web.responses.requests.get")
|
||||
@patch("trafilatura.extract")
|
||||
def test_web_read_extraction_failure(
|
||||
mock_trafilatura_extract: MagicMock,
|
||||
mock_trafilatura_fetch_url: MagicMock,
|
||||
mock_requests_get: MagicMock,
|
||||
) -> None:
|
||||
"""Test the /web/web_read endpoint when content extraction fails.
|
||||
|
||||
This test verifies that the endpoint returns a 422 Unprocessable Entity error when
|
||||
`trafilatura.extract` returns None, indicating a failure to extract meaningful content.
|
||||
"""
|
||||
mock_trafilatura_fetch_url.return_value = (
|
||||
"<html><body><p>No extractable content</p></body></html>"
|
||||
)
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.text = "<html><body><p>No extractable content</p></body></html>"
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_requests_get.return_value = mock_response
|
||||
mock_trafilatura_extract.return_value = None
|
||||
|
||||
request_data = WebRequest(url="http://example.com/empty")
|
||||
response = client.post("/web/web_read", json=request_data.model_dump())
|
||||
request_data = {"url": "http://example.com/empty"}
|
||||
response = client.post("/web/read", json=request_data)
|
||||
|
||||
assert response.status_code == 422
|
||||
assert "Unable to extract content from URL" in response.json()["detail"]
|
||||
mock_trafilatura_fetch_url.assert_called_once_with("http://example.com/empty")
|
||||
mock_requests_get.assert_called_once_with(
|
||||
"http://example.com/empty",
|
||||
headers={
|
||||
"User-Agent": (
|
||||
"Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0esr"
|
||||
)
|
||||
},
|
||||
timeout=30,
|
||||
)
|
||||
mock_trafilatura_extract.assert_called_once()
|
||||
|
||||
|
||||
|
@ -105,16 +133,26 @@ def test_web_raw_success(mock_requests_get: MagicMock) -> None:
|
|||
mock_response.status_code = 200
|
||||
mock_response.text = "<html>Raw HTML</html>"
|
||||
mock_response.headers = {"Content-Type": "text/html"}
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_requests_get.return_value = mock_response
|
||||
|
||||
response = client.get("/web/web_raw?url=http://example.com/raw")
|
||||
request_data = {"url": "http://example.com/raw"}
|
||||
response = client.post("/web/raw", json=request_data)
|
||||
|
||||
assert response.status_code == 200
|
||||
assert response.json()["url"] == "http://example.com/raw"
|
||||
assert response.json()["content"] == "<html>Raw HTML</html>"
|
||||
assert response.json()["status_code"] == 200
|
||||
assert response.json()["headers"]["Content-Type"] == "text/html"
|
||||
mock_requests_get.assert_called_once_with("http://example.com/raw", timeout=30)
|
||||
mock_requests_get.assert_called_once_with(
|
||||
"http://example.com/raw",
|
||||
headers={
|
||||
"User-Agent": (
|
||||
"Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0esr"
|
||||
)
|
||||
},
|
||||
timeout=30,
|
||||
)
|
||||
|
||||
|
||||
@patch("requests.get")
|
||||
|
@ -127,8 +165,17 @@ def test_web_raw_fetch_failure(mock_requests_get: MagicMock) -> None:
|
|||
"""
|
||||
mock_requests_get.side_effect = requests.exceptions.RequestException("Connection error")
|
||||
|
||||
response = client.get("/web/web_raw?url=http://nonexistent.com/raw")
|
||||
request_data = {"url": "http://nonexistent.com/raw"}
|
||||
response = client.post("/web/raw", json=request_data)
|
||||
|
||||
assert response.status_code == 503
|
||||
assert "Unable to fetch content from URL" in response.json()["detail"]
|
||||
mock_requests_get.assert_called_once_with("http://nonexistent.com/raw", timeout=30)
|
||||
mock_requests_get.assert_called_once_with(
|
||||
"http://nonexistent.com/raw",
|
||||
headers={
|
||||
"User-Agent": (
|
||||
"Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0esr"
|
||||
)
|
||||
},
|
||||
timeout=30,
|
||||
)
|
||||
|
|
|
@ -1,10 +1,17 @@
|
|||
"""Tests for the health endpoint."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
import pytest
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_health_endpoint(client) -> None:
|
||||
def test_health_endpoint(client: TestClient) -> None:
|
||||
"""Test the global health endpoint."""
|
||||
response = client.get("/health")
|
||||
|
||||
|
@ -16,7 +23,7 @@ def test_health_endpoint(client) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_health_endpoint_response_schema(client) -> None:
|
||||
def test_health_endpoint_response_schema(client: TestClient) -> None:
|
||||
"""Test that health endpoint returns correct schema."""
|
||||
response = client.get("/health")
|
||||
|
||||
|
@ -31,7 +38,7 @@ def test_health_endpoint_response_schema(client) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_health_endpoint_headers(client) -> None:
|
||||
def test_health_endpoint_headers(client: TestClient) -> None:
|
||||
"""Test health endpoint returns correct headers."""
|
||||
response = client.get("/health")
|
||||
|
||||
|
|
|
@ -1,10 +1,18 @@
|
|||
"""Tests for the FastAPI server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
import pytest
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from fastapi import FastAPI
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_server_creation(app) -> None:
|
||||
def test_server_creation(app: FastAPI) -> None:
|
||||
"""Test that the FastAPI app is created successfully."""
|
||||
assert app is not None
|
||||
assert hasattr(app, "routes")
|
||||
|
@ -12,7 +20,7 @@ def test_server_creation(app) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_openapi_docs_endpoint(client) -> None:
|
||||
def test_openapi_docs_endpoint(client: TestClient) -> None:
|
||||
"""Test that OpenAPI docs endpoint is accessible."""
|
||||
response = client.get("/docs")
|
||||
assert response.status_code == 200
|
||||
|
@ -20,7 +28,7 @@ def test_openapi_docs_endpoint(client) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_redoc_endpoint(client) -> None:
|
||||
def test_redoc_endpoint(client: TestClient) -> None:
|
||||
"""Test that ReDoc endpoint is accessible."""
|
||||
response = client.get("/redoc")
|
||||
assert response.status_code == 200
|
||||
|
@ -28,7 +36,7 @@ def test_redoc_endpoint(client) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_openapi_json_endpoint(client) -> None:
|
||||
def test_openapi_json_endpoint(client: TestClient) -> None:
|
||||
"""Test that OpenAPI JSON schema is accessible."""
|
||||
response = client.get("/openapi.json")
|
||||
assert response.status_code == 200
|
||||
|
@ -41,7 +49,7 @@ def test_openapi_json_endpoint(client) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_tool_endpoints_registered(client) -> None:
|
||||
def test_tool_endpoints_registered(client: TestClient) -> None:
|
||||
"""Test that all tool endpoints are registered."""
|
||||
response = client.get("/openapi.json")
|
||||
assert response.status_code == 200
|
||||
|
@ -58,18 +66,25 @@ def test_tool_endpoints_registered(client) -> None:
|
|||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_cors_headers(client) -> None:
|
||||
def test_cors_headers(client: TestClient) -> None:
|
||||
"""Test that CORS headers are set correctly."""
|
||||
response = client.get("/health")
|
||||
# Test with an OPTIONS request which should trigger CORS headers
|
||||
response = client.options("/health", headers={"Origin": "http://example.com"})
|
||||
assert response.status_code == 200
|
||||
|
||||
# Check for presence of CORS headers
|
||||
headers = response.headers
|
||||
assert "access-control-allow-origin" in headers or "Access-Control-Allow-Origin" in headers
|
||||
|
||||
# Also test with a cross-origin GET request
|
||||
response = client.get("/health", headers={"Origin": "http://example.com"})
|
||||
assert response.status_code == 200
|
||||
headers = response.headers
|
||||
assert "access-control-allow-origin" in headers or "Access-Control-Allow-Origin" in headers
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_404_handling(client) -> None:
|
||||
def test_404_handling(client: TestClient) -> None:
|
||||
"""Test that 404 errors are handled properly."""
|
||||
response = client.get("/nonexistent/endpoint")
|
||||
assert response.status_code == 404
|
||||
|
|
|
@ -1 +1,3 @@
|
|||
"""Unit tests package for the OpenAPI MCP Server."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
|
|
@ -1 +1,3 @@
|
|||
"""Unit tests for tool modules."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
|
404
uv.lock
generated
404
uv.lock
generated
|
@ -35,11 +35,11 @@ wheels = [
|
|||
|
||||
[[package]]
|
||||
name = "certifi"
|
||||
version = "2025.7.9"
|
||||
version = "2025.7.14"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/de/8a/c729b6b60c66a38f590c4e774decc4b2ec7b0576be8f1aa984a53ffa812a/certifi-2025.7.9.tar.gz", hash = "sha256:c1d2ec05395148ee10cf672ffc28cd37ea0ab0d99f9cc74c43e588cbd111b079", size = 160386, upload-time = "2025-07-09T02:13:58.874Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/b3/76/52c535bcebe74590f296d6c77c86dabf761c41980e1347a2422e4aa2ae41/certifi-2025.7.14.tar.gz", hash = "sha256:8ea99dbdfaaf2ba2f9bac77b9249ef62ec5218e7c2b2e903378ed5fccf765995", size = 163981, upload-time = "2025-07-14T03:29:28.449Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/66/f3/80a3f974c8b535d394ff960a11ac20368e06b736da395b551a49ce950cce/certifi-2025.7.9-py3-none-any.whl", hash = "sha256:d842783a14f8fdd646895ac26f719a061408834473cfc10203f6a575beb15d39", size = 159230, upload-time = "2025-07-09T02:13:57.007Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/4f/52/34c6cf5bb9285074dc3531c437b3919e825d976fde097a7a73f79e726d03/certifi-2025.7.14-py3-none-any.whl", hash = "sha256:6b31f564a415d79ee77df69d757bb49a5bb53bd9f756cbbe24394ffd6fc1f4b2", size = 162722, upload-time = "2025-07-14T03:29:26.863Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
|
@ -101,33 +101,55 @@ wheels = [
|
|||
|
||||
[[package]]
|
||||
name = "coverage"
|
||||
version = "7.9.2"
|
||||
version = "7.10.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/04/b7/c0465ca253df10a9e8dae0692a4ae6e9726d245390aaef92360e1d6d3832/coverage-7.9.2.tar.gz", hash = "sha256:997024fa51e3290264ffd7492ec97d0690293ccd2b45a6cd7d82d945a4a80c8b", size = 813556, upload-time = "2025-07-03T10:54:15.101Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/87/0e/66dbd4c6a7f0758a8d18044c048779ba21fb94856e1edcf764bd5403e710/coverage-7.10.1.tar.gz", hash = "sha256:ae2b4856f29ddfe827106794f3589949a57da6f0d38ab01e24ec35107979ba57", size = 819938, upload-time = "2025-07-27T14:13:39.045Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/94/9d/7a8edf7acbcaa5e5c489a646226bed9591ee1c5e6a84733c0140e9ce1ae1/coverage-7.9.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:985abe7f242e0d7bba228ab01070fde1d6c8fa12f142e43debe9ed1dde686038", size = 212367, upload-time = "2025-07-03T10:53:25.811Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e8/9e/5cd6f130150712301f7e40fb5865c1bc27b97689ec57297e568d972eec3c/coverage-7.9.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82c3939264a76d44fde7f213924021ed31f55ef28111a19649fec90c0f109e6d", size = 212632, upload-time = "2025-07-03T10:53:27.075Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a8/de/6287a2c2036f9fd991c61cefa8c64e57390e30c894ad3aa52fac4c1e14a8/coverage-7.9.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ae5d563e970dbe04382f736ec214ef48103d1b875967c89d83c6e3f21706d5b3", size = 245793, upload-time = "2025-07-03T10:53:28.408Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/06/cc/9b5a9961d8160e3cb0b558c71f8051fe08aa2dd4b502ee937225da564ed1/coverage-7.9.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bdd612e59baed2a93c8843c9a7cb902260f181370f1d772f4842987535071d14", size = 243006, upload-time = "2025-07-03T10:53:29.754Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/49/d9/4616b787d9f597d6443f5588619c1c9f659e1f5fc9eebf63699eb6d34b78/coverage-7.9.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:256ea87cb2a1ed992bcdfc349d8042dcea1b80436f4ddf6e246d6bee4b5d73b6", size = 244990, upload-time = "2025-07-03T10:53:31.098Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/48/83/801cdc10f137b2d02b005a761661649ffa60eb173dcdaeb77f571e4dc192/coverage-7.9.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f44ae036b63c8ea432f610534a2668b0c3aee810e7037ab9d8ff6883de480f5b", size = 245157, upload-time = "2025-07-03T10:53:32.717Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c8/a4/41911ed7e9d3ceb0ffb019e7635468df7499f5cc3edca5f7dfc078e9c5ec/coverage-7.9.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:82d76ad87c932935417a19b10cfe7abb15fd3f923cfe47dbdaa74ef4e503752d", size = 243128, upload-time = "2025-07-03T10:53:34.009Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/10/41/344543b71d31ac9cb00a664d5d0c9ef134a0fe87cb7d8430003b20fa0b7d/coverage-7.9.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:619317bb86de4193debc712b9e59d5cffd91dc1d178627ab2a77b9870deb2868", size = 244511, upload-time = "2025-07-03T10:53:35.434Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d5/81/3b68c77e4812105e2a060f6946ba9e6f898ddcdc0d2bfc8b4b152a9ae522/coverage-7.9.2-cp313-cp313-win32.whl", hash = "sha256:0a07757de9feb1dfafd16ab651e0f628fd7ce551604d1bf23e47e1ddca93f08a", size = 214765, upload-time = "2025-07-03T10:53:36.787Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/06/a2/7fac400f6a346bb1a4004eb2a76fbff0e242cd48926a2ce37a22a6a1d917/coverage-7.9.2-cp313-cp313-win_amd64.whl", hash = "sha256:115db3d1f4d3f35f5bb021e270edd85011934ff97c8797216b62f461dd69374b", size = 215536, upload-time = "2025-07-03T10:53:38.188Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/08/47/2c6c215452b4f90d87017e61ea0fd9e0486bb734cb515e3de56e2c32075f/coverage-7.9.2-cp313-cp313-win_arm64.whl", hash = "sha256:48f82f889c80af8b2a7bb6e158d95a3fbec6a3453a1004d04e4f3b5945a02694", size = 213943, upload-time = "2025-07-03T10:53:39.492Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a3/46/e211e942b22d6af5e0f323faa8a9bc7c447a1cf1923b64c47523f36ed488/coverage-7.9.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:55a28954545f9d2f96870b40f6c3386a59ba8ed50caf2d949676dac3ecab99f5", size = 213088, upload-time = "2025-07-03T10:53:40.874Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/2f/762551f97e124442eccd907bf8b0de54348635b8866a73567eb4e6417acf/coverage-7.9.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:cdef6504637731a63c133bb2e6f0f0214e2748495ec15fe42d1e219d1b133f0b", size = 213298, upload-time = "2025-07-03T10:53:42.218Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7a/b7/76d2d132b7baf7360ed69be0bcab968f151fa31abe6d067f0384439d9edb/coverage-7.9.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bcd5ebe66c7a97273d5d2ddd4ad0ed2e706b39630ed4b53e713d360626c3dbb3", size = 256541, upload-time = "2025-07-03T10:53:43.823Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a0/17/392b219837d7ad47d8e5974ce5f8dc3deb9f99a53b3bd4d123602f960c81/coverage-7.9.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9303aed20872d7a3c9cb39c5d2b9bdbe44e3a9a1aecb52920f7e7495410dfab8", size = 252761, upload-time = "2025-07-03T10:53:45.19Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d5/77/4256d3577fe1b0daa8d3836a1ebe68eaa07dd2cbaf20cf5ab1115d6949d4/coverage-7.9.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc18ea9e417a04d1920a9a76fe9ebd2f43ca505b81994598482f938d5c315f46", size = 254917, upload-time = "2025-07-03T10:53:46.931Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/53/99/fc1a008eef1805e1ddb123cf17af864743354479ea5129a8f838c433cc2c/coverage-7.9.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6406cff19880aaaadc932152242523e892faff224da29e241ce2fca329866584", size = 256147, upload-time = "2025-07-03T10:53:48.289Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/92/c0/f63bf667e18b7f88c2bdb3160870e277c4874ced87e21426128d70aa741f/coverage-7.9.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:2d0d4f6ecdf37fcc19c88fec3e2277d5dee740fb51ffdd69b9579b8c31e4232e", size = 254261, upload-time = "2025-07-03T10:53:49.99Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8c/32/37dd1c42ce3016ff8ec9e4b607650d2e34845c0585d3518b2a93b4830c1a/coverage-7.9.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c33624f50cf8de418ab2b4d6ca9eda96dc45b2c4231336bac91454520e8d1fac", size = 255099, upload-time = "2025-07-03T10:53:51.354Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/da/2e/af6b86f7c95441ce82f035b3affe1cd147f727bbd92f563be35e2d585683/coverage-7.9.2-cp313-cp313t-win32.whl", hash = "sha256:1df6b76e737c6a92210eebcb2390af59a141f9e9430210595251fbaf02d46926", size = 215440, upload-time = "2025-07-03T10:53:52.808Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/4d/bb/8a785d91b308867f6b2e36e41c569b367c00b70c17f54b13ac29bcd2d8c8/coverage-7.9.2-cp313-cp313t-win_amd64.whl", hash = "sha256:f5fd54310b92741ebe00d9c0d1d7b2b27463952c022da6d47c175d246a98d1bd", size = 216537, upload-time = "2025-07-03T10:53:54.273Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1d/a0/a6bffb5e0f41a47279fd45a8f3155bf193f77990ae1c30f9c224b61cacb0/coverage-7.9.2-cp313-cp313t-win_arm64.whl", hash = "sha256:c48c2375287108c887ee87d13b4070a381c6537d30e8487b24ec721bf2a781cb", size = 214398, upload-time = "2025-07-03T10:53:56.715Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3c/38/bbe2e63902847cf79036ecc75550d0698af31c91c7575352eb25190d0fb3/coverage-7.9.2-py3-none-any.whl", hash = "sha256:e425cd5b00f6fc0ed7cdbd766c70be8baab4b7839e4d4fe5fac48581dd968ea4", size = 204005, upload-time = "2025-07-03T10:54:13.491Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ef/72/135ff5fef09b1ffe78dbe6fcf1e16b2e564cd35faeacf3d63d60d887f12d/coverage-7.10.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ebb08d0867c5a25dffa4823377292a0ffd7aaafb218b5d4e2e106378b1061e39", size = 214960, upload-time = "2025-07-27T14:11:55.959Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b1/aa/73a5d1a6fc08ca709a8177825616aa95ee6bf34d522517c2595484a3e6c9/coverage-7.10.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f32a95a83c2e17422f67af922a89422cd24c6fa94041f083dd0bb4f6057d0bc7", size = 215220, upload-time = "2025-07-27T14:11:57.899Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8d/40/3124fdd45ed3772a42fc73ca41c091699b38a2c3bd4f9cb564162378e8b6/coverage-7.10.1-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:c4c746d11c8aba4b9f58ca8bfc6fbfd0da4efe7960ae5540d1a1b13655ee8892", size = 245772, upload-time = "2025-07-27T14:12:00.422Z" },
{ url = "https://files.pythonhosted.org/packages/42/62/a77b254822efa8c12ad59e8039f2bc3df56dc162ebda55e1943e35ba31a5/coverage-7.10.1-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7f39edd52c23e5c7ed94e0e4bf088928029edf86ef10b95413e5ea670c5e92d7", size = 248116, upload-time = "2025-07-27T14:12:03.099Z" },
{ url = "https://files.pythonhosted.org/packages/1d/01/8101f062f472a3a6205b458d18ef0444a63ae5d36a8a5ed5dd0f6167f4db/coverage-7.10.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ab6e19b684981d0cd968906e293d5628e89faacb27977c92f3600b201926b994", size = 249554, upload-time = "2025-07-27T14:12:04.668Z" },
{ url = "https://files.pythonhosted.org/packages/8f/7b/e51bc61573e71ff7275a4f167aecbd16cb010aefdf54bcd8b0a133391263/coverage-7.10.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5121d8cf0eacb16133501455d216bb5f99899ae2f52d394fe45d59229e6611d0", size = 247766, upload-time = "2025-07-27T14:12:06.234Z" },
{ url = "https://files.pythonhosted.org/packages/4b/71/1c96d66a51d4204a9d6d12df53c4071d87e110941a2a1fe94693192262f5/coverage-7.10.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:df1c742ca6f46a6f6cbcaef9ac694dc2cb1260d30a6a2f5c68c5f5bcfee1cfd7", size = 245735, upload-time = "2025-07-27T14:12:08.305Z" },
{ url = "https://files.pythonhosted.org/packages/13/d5/efbc2ac4d35ae2f22ef6df2ca084c60e13bd9378be68655e3268c80349ab/coverage-7.10.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:40f9a38676f9c073bf4b9194707aa1eb97dca0e22cc3766d83879d72500132c7", size = 247118, upload-time = "2025-07-27T14:12:09.903Z" },
{ url = "https://files.pythonhosted.org/packages/d1/22/073848352bec28ca65f2b6816b892fcf9a31abbef07b868487ad15dd55f1/coverage-7.10.1-cp313-cp313-win32.whl", hash = "sha256:2348631f049e884839553b9974f0821d39241c6ffb01a418efce434f7eba0fe7", size = 217381, upload-time = "2025-07-27T14:12:11.535Z" },
{ url = "https://files.pythonhosted.org/packages/b7/df/df6a0ff33b042f000089bd11b6bb034bab073e2ab64a56e78ed882cba55d/coverage-7.10.1-cp313-cp313-win_amd64.whl", hash = "sha256:4072b31361b0d6d23f750c524f694e1a417c1220a30d3ef02741eed28520c48e", size = 218152, upload-time = "2025-07-27T14:12:13.182Z" },
{ url = "https://files.pythonhosted.org/packages/30/e3/5085ca849a40ed6b47cdb8f65471c2f754e19390b5a12fa8abd25cbfaa8f/coverage-7.10.1-cp313-cp313-win_arm64.whl", hash = "sha256:3e31dfb8271937cab9425f19259b1b1d1f556790e98eb266009e7a61d337b6d4", size = 216559, upload-time = "2025-07-27T14:12:14.807Z" },
{ url = "https://files.pythonhosted.org/packages/cc/93/58714efbfdeb547909feaabe1d67b2bdd59f0597060271b9c548d5efb529/coverage-7.10.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1c4f679c6b573a5257af6012f167a45be4c749c9925fd44d5178fd641ad8bf72", size = 215677, upload-time = "2025-07-27T14:12:16.68Z" },
{ url = "https://files.pythonhosted.org/packages/c0/0c/18eaa5897e7e8cb3f8c45e563e23e8a85686b4585e29d53cacb6bc9cb340/coverage-7.10.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:871ebe8143da284bd77b84a9136200bd638be253618765d21a1fce71006d94af", size = 215899, upload-time = "2025-07-27T14:12:18.758Z" },
{ url = "https://files.pythonhosted.org/packages/84/c1/9d1affacc3c75b5a184c140377701bbf14fc94619367f07a269cd9e4fed6/coverage-7.10.1-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:998c4751dabf7d29b30594af416e4bf5091f11f92a8d88eb1512c7ba136d1ed7", size = 257140, upload-time = "2025-07-27T14:12:20.357Z" },
{ url = "https://files.pythonhosted.org/packages/3d/0f/339bc6b8fa968c346df346068cca1f24bdea2ddfa93bb3dc2e7749730962/coverage-7.10.1-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:780f750a25e7749d0af6b3631759c2c14f45de209f3faaa2398312d1c7a22759", size = 259005, upload-time = "2025-07-27T14:12:22.007Z" },
{ url = "https://files.pythonhosted.org/packages/c8/22/89390864b92ea7c909079939b71baba7e5b42a76bf327c1d615bd829ba57/coverage-7.10.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:590bdba9445df4763bdbebc928d8182f094c1f3947a8dc0fc82ef014dbdd8324", size = 261143, upload-time = "2025-07-27T14:12:23.746Z" },
{ url = "https://files.pythonhosted.org/packages/2c/56/3d04d89017c0c41c7a71bd69b29699d919b6bbf2649b8b2091240b97dd6a/coverage-7.10.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b2df80cb6a2af86d300e70acb82e9b79dab2c1e6971e44b78dbfc1a1e736b53", size = 258735, upload-time = "2025-07-27T14:12:25.73Z" },
{ url = "https://files.pythonhosted.org/packages/cb/40/312252c8afa5ca781063a09d931f4b9409dc91526cd0b5a2b84143ffafa2/coverage-7.10.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:d6a558c2725bfb6337bf57c1cd366c13798bfd3bfc9e3dd1f4a6f6fc95a4605f", size = 256871, upload-time = "2025-07-27T14:12:27.767Z" },
{ url = "https://files.pythonhosted.org/packages/1f/2b/564947d5dede068215aaddb9e05638aeac079685101462218229ddea9113/coverage-7.10.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e6150d167f32f2a54690e572e0a4c90296fb000a18e9b26ab81a6489e24e78dd", size = 257692, upload-time = "2025-07-27T14:12:29.347Z" },
{ url = "https://files.pythonhosted.org/packages/93/1b/c8a867ade85cb26d802aea2209b9c2c80613b9c122baa8c8ecea6799648f/coverage-7.10.1-cp313-cp313t-win32.whl", hash = "sha256:d946a0c067aa88be4a593aad1236493313bafaa27e2a2080bfe88db827972f3c", size = 218059, upload-time = "2025-07-27T14:12:31.076Z" },
{ url = "https://files.pythonhosted.org/packages/a1/fe/cd4ab40570ae83a516bf5e754ea4388aeedd48e660e40c50b7713ed4f930/coverage-7.10.1-cp313-cp313t-win_amd64.whl", hash = "sha256:e37c72eaccdd5ed1130c67a92ad38f5b2af66eeff7b0abe29534225db2ef7b18", size = 219150, upload-time = "2025-07-27T14:12:32.746Z" },
{ url = "https://files.pythonhosted.org/packages/8d/16/6e5ed5854be6d70d0c39e9cb9dd2449f2c8c34455534c32c1a508c7dbdb5/coverage-7.10.1-cp313-cp313t-win_arm64.whl", hash = "sha256:89ec0ffc215c590c732918c95cd02b55c7d0f569d76b90bb1a5e78aa340618e4", size = 217014, upload-time = "2025-07-27T14:12:34.406Z" },
{ url = "https://files.pythonhosted.org/packages/54/8e/6d0bfe9c3d7121cf936c5f8b03e8c3da1484fb801703127dba20fb8bd3c7/coverage-7.10.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:166d89c57e877e93d8827dac32cedae6b0277ca684c6511497311249f35a280c", size = 214951, upload-time = "2025-07-27T14:12:36.069Z" },
{ url = "https://files.pythonhosted.org/packages/f2/29/e3e51a8c653cf2174c60532aafeb5065cea0911403fa144c9abe39790308/coverage-7.10.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:bed4a2341b33cd1a7d9ffc47df4a78ee61d3416d43b4adc9e18b7d266650b83e", size = 215229, upload-time = "2025-07-27T14:12:37.759Z" },
{ url = "https://files.pythonhosted.org/packages/e0/59/3c972080b2fa18b6c4510201f6d4dc87159d450627d062cd9ad051134062/coverage-7.10.1-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:ddca1e4f5f4c67980533df01430184c19b5359900e080248bbf4ed6789584d8b", size = 245738, upload-time = "2025-07-27T14:12:39.453Z" },
{ url = "https://files.pythonhosted.org/packages/2e/04/fc0d99d3f809452654e958e1788454f6e27b34e43f8f8598191c8ad13537/coverage-7.10.1-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:37b69226001d8b7de7126cad7366b0778d36777e4d788c66991455ba817c5b41", size = 248045, upload-time = "2025-07-27T14:12:41.387Z" },
{ url = "https://files.pythonhosted.org/packages/5e/2e/afcbf599e77e0dfbf4c97197747250d13d397d27e185b93987d9eaac053d/coverage-7.10.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b2f22102197bcb1722691296f9e589f02b616f874e54a209284dd7b9294b0b7f", size = 249666, upload-time = "2025-07-27T14:12:43.056Z" },
{ url = "https://files.pythonhosted.org/packages/6e/ae/bc47f7f8ecb7a06cbae2bf86a6fa20f479dd902bc80f57cff7730438059d/coverage-7.10.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1e0c768b0f9ac5839dac5cf88992a4bb459e488ee8a1f8489af4cb33b1af00f1", size = 247692, upload-time = "2025-07-27T14:12:44.83Z" },
{ url = "https://files.pythonhosted.org/packages/b6/26/cbfa3092d31ccba8ba7647e4d25753263e818b4547eba446b113d7d1efdf/coverage-7.10.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:991196702d5e0b120a8fef2664e1b9c333a81d36d5f6bcf6b225c0cf8b0451a2", size = 245536, upload-time = "2025-07-27T14:12:46.527Z" },
{ url = "https://files.pythonhosted.org/packages/56/77/9c68e92500e6a1c83d024a70eadcc9a173f21aadd73c4675fe64c9c43fdf/coverage-7.10.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:ae8e59e5f4fd85d6ad34c2bb9d74037b5b11be072b8b7e9986beb11f957573d4", size = 246954, upload-time = "2025-07-27T14:12:49.279Z" },
{ url = "https://files.pythonhosted.org/packages/7f/a5/ba96671c5a669672aacd9877a5987c8551501b602827b4e84256da2a30a7/coverage-7.10.1-cp314-cp314-win32.whl", hash = "sha256:042125c89cf74a074984002e165d61fe0e31c7bd40ebb4bbebf07939b5924613", size = 217616, upload-time = "2025-07-27T14:12:51.214Z" },
{ url = "https://files.pythonhosted.org/packages/e7/3c/e1e1eb95fc1585f15a410208c4795db24a948e04d9bde818fe4eb893bc85/coverage-7.10.1-cp314-cp314-win_amd64.whl", hash = "sha256:a22c3bfe09f7a530e2c94c87ff7af867259c91bef87ed2089cd69b783af7b84e", size = 218412, upload-time = "2025-07-27T14:12:53.429Z" },
{ url = "https://files.pythonhosted.org/packages/b0/85/7e1e5be2cb966cba95566ba702b13a572ca744fbb3779df9888213762d67/coverage-7.10.1-cp314-cp314-win_arm64.whl", hash = "sha256:ee6be07af68d9c4fca4027c70cea0c31a0f1bc9cb464ff3c84a1f916bf82e652", size = 216776, upload-time = "2025-07-27T14:12:55.482Z" },
{ url = "https://files.pythonhosted.org/packages/62/0f/5bb8f29923141cca8560fe2217679caf4e0db643872c1945ac7d8748c2a7/coverage-7.10.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:d24fb3c0c8ff0d517c5ca5de7cf3994a4cd559cde0315201511dbfa7ab528894", size = 215698, upload-time = "2025-07-27T14:12:57.225Z" },
{ url = "https://files.pythonhosted.org/packages/80/29/547038ffa4e8e4d9e82f7dfc6d152f75fcdc0af146913f0ba03875211f03/coverage-7.10.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1217a54cfd79be20512a67ca81c7da3f2163f51bbfd188aab91054df012154f5", size = 215902, upload-time = "2025-07-27T14:12:59.071Z" },
{ url = "https://files.pythonhosted.org/packages/e1/8a/7aaa8fbfaed900147987a424e112af2e7790e1ac9cd92601e5bd4e1ba60a/coverage-7.10.1-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:51f30da7a52c009667e02f125737229d7d8044ad84b79db454308033a7808ab2", size = 257230, upload-time = "2025-07-27T14:13:01.248Z" },
{ url = "https://files.pythonhosted.org/packages/e5/1d/c252b5ffac44294e23a0d79dd5acf51749b39795ccc898faeabf7bee903f/coverage-7.10.1-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:ed3718c757c82d920f1c94089066225ca2ad7f00bb904cb72b1c39ebdd906ccb", size = 259194, upload-time = "2025-07-27T14:13:03.247Z" },
{ url = "https://files.pythonhosted.org/packages/16/ad/6c8d9f83d08f3bac2e7507534d0c48d1a4f52c18e6f94919d364edbdfa8f/coverage-7.10.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cc452481e124a819ced0c25412ea2e144269ef2f2534b862d9f6a9dae4bda17b", size = 261316, upload-time = "2025-07-27T14:13:04.957Z" },
{ url = "https://files.pythonhosted.org/packages/d6/4e/f9bbf3a36c061e2e0e0f78369c006d66416561a33d2bee63345aee8ee65e/coverage-7.10.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:9d6f494c307e5cb9b1e052ec1a471060f1dea092c8116e642e7a23e79d9388ea", size = 258794, upload-time = "2025-07-27T14:13:06.715Z" },
{ url = "https://files.pythonhosted.org/packages/87/82/e600bbe78eb2cb0541751d03cef9314bcd0897e8eea156219c39b685f869/coverage-7.10.1-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:fc0e46d86905ddd16b85991f1f4919028092b4e511689bbdaff0876bd8aab3dd", size = 256869, upload-time = "2025-07-27T14:13:08.933Z" },
{ url = "https://files.pythonhosted.org/packages/ce/5d/2fc9a9236c5268f68ac011d97cd3a5ad16cc420535369bedbda659fdd9b7/coverage-7.10.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:80b9ccd82e30038b61fc9a692a8dc4801504689651b281ed9109f10cc9fe8b4d", size = 257765, upload-time = "2025-07-27T14:13:10.778Z" },
{ url = "https://files.pythonhosted.org/packages/8a/05/b4e00b2bd48a2dc8e1c7d2aea7455f40af2e36484ab2ef06deb85883e9fe/coverage-7.10.1-cp314-cp314t-win32.whl", hash = "sha256:e58991a2b213417285ec866d3cd32db17a6a88061a985dbb7e8e8f13af429c47", size = 218420, upload-time = "2025-07-27T14:13:12.882Z" },
{ url = "https://files.pythonhosted.org/packages/77/fb/d21d05f33ea27ece327422240e69654b5932b0b29e7fbc40fbab3cf199bf/coverage-7.10.1-cp314-cp314t-win_amd64.whl", hash = "sha256:e88dd71e4ecbc49d9d57d064117462c43f40a21a1383507811cf834a4a620651", size = 219536, upload-time = "2025-07-27T14:13:14.718Z" },
{ url = "https://files.pythonhosted.org/packages/a6/68/7fea94b141281ed8be3d1d5c4319a97f2befc3e487ce33657fc64db2c45e/coverage-7.10.1-cp314-cp314t-win_arm64.whl", hash = "sha256:1aadfb06a30c62c2eb82322171fe1f7c288c80ca4156d46af0ca039052814bab", size = 217190, upload-time = "2025-07-27T14:13:16.85Z" },
{ url = "https://files.pythonhosted.org/packages/0f/64/922899cff2c0fd3496be83fa8b81230f5a8d82a2ad30f98370b133c2c83b/coverage-7.10.1-py3-none-any.whl", hash = "sha256:fa2a258aa6bf188eb9a8948f7102a83da7c430a0dce918dbd8b60ef8fcb772d7", size = 206597, upload-time = "2025-07-27T14:13:37.221Z" },
]

[[package]]
@@ -147,16 +169,16 @@ wheels = [

[[package]]
name = "fastapi"
version = "0.116.0"
version = "0.116.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
{ name = "starlette" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/20/38/e1da78736143fd885c36213a3ccc493c384ae8fea6a0f0bc272ef42ebea8/fastapi-0.116.0.tar.gz", hash = "sha256:80dc0794627af0390353a6d1171618276616310d37d24faba6648398e57d687a", size = 296518, upload-time = "2025-07-07T15:09:27.82Z" }
sdist = { url = "https://files.pythonhosted.org/packages/78/d7/6c8b3bfe33eeffa208183ec037fee0cce9f7f024089ab1c5d12ef04bd27c/fastapi-0.116.1.tar.gz", hash = "sha256:ed52cbf946abfd70c5a0dccb24673f0670deeb517a88b3544d03c2a6bf283143", size = 296485, upload-time = "2025-07-11T16:22:32.057Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2f/68/d80347fe2360445b5f58cf290e588a4729746e7501080947e6cdae114b1f/fastapi-0.116.0-py3-none-any.whl", hash = "sha256:fdcc9ed272eaef038952923bef2b735c02372402d1203ee1210af4eea7a78d2b", size = 95625, upload-time = "2025-07-07T15:09:26.348Z" },
{ url = "https://files.pythonhosted.org/packages/e5/47/d63c60f59a59467fda0f93f46335c9d18526d7071f025cb5b89d5353ea42/fastapi-0.116.1-py3-none-any.whl", hash = "sha256:c46ac7c312df840f0c9e220f7964bada936781bc4e2e6eb71f1c4d7553786565", size = 95631, upload-time = "2025-07-11T16:22:30.485Z" },
]

[[package]]
@@ -321,33 +343,90 @@ wheels = [
]

[[package]]
name = "numpy"
version = "2.3.1"
name = "mypy"
version = "1.17.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2e/19/d7c972dfe90a353dbd3efbbe1d14a5951de80c99c9dc1b93cd998d51dc0f/numpy-2.3.1.tar.gz", hash = "sha256:1ec9ae20a4226da374362cca3c62cd753faf2f951440b0e3b98e93c235441d2b", size = 20390372, upload-time = "2025-06-21T12:28:33.469Z" }
dependencies = [
{ name = "mypy-extensions" },
{ name = "pathspec" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8e/22/ea637422dedf0bf36f3ef238eab4e455e2a0dcc3082b5cc067615347ab8e/mypy-1.17.1.tar.gz", hash = "sha256:25e01ec741ab5bb3eec8ba9cdb0f769230368a22c959c4937360efb89b7e9f01", size = 3352570, upload-time = "2025-07-31T07:54:19.204Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d4/bd/35ad97006d8abff8631293f8ea6adf07b0108ce6fec68da3c3fcca1197f2/numpy-2.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:25a1992b0a3fdcdaec9f552ef10d8103186f5397ab45e2d25f8ac51b1a6b97e8", size = 20889381, upload-time = "2025-06-21T12:19:04.103Z" },
{ url = "https://files.pythonhosted.org/packages/f1/4f/df5923874d8095b6062495b39729178eef4a922119cee32a12ee1bd4664c/numpy-2.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7dea630156d39b02a63c18f508f85010230409db5b2927ba59c8ba4ab3e8272e", size = 14152726, upload-time = "2025-06-21T12:19:25.599Z" },
{ url = "https://files.pythonhosted.org/packages/8c/0f/a1f269b125806212a876f7efb049b06c6f8772cf0121139f97774cd95626/numpy-2.3.1-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:bada6058dd886061f10ea15f230ccf7dfff40572e99fef440a4a857c8728c9c0", size = 5105145, upload-time = "2025-06-21T12:19:34.782Z" },
{ url = "https://files.pythonhosted.org/packages/6d/63/a7f7fd5f375b0361682f6ffbf686787e82b7bbd561268e4f30afad2bb3c0/numpy-2.3.1-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:a894f3816eb17b29e4783e5873f92faf55b710c2519e5c351767c51f79d8526d", size = 6639409, upload-time = "2025-06-21T12:19:45.228Z" },
{ url = "https://files.pythonhosted.org/packages/bf/0d/1854a4121af895aab383f4aa233748f1df4671ef331d898e32426756a8a6/numpy-2.3.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:18703df6c4a4fee55fd3d6e5a253d01c5d33a295409b03fda0c86b3ca2ff41a1", size = 14257630, upload-time = "2025-06-21T12:20:06.544Z" },
{ url = "https://files.pythonhosted.org/packages/50/30/af1b277b443f2fb08acf1c55ce9d68ee540043f158630d62cef012750f9f/numpy-2.3.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:5902660491bd7a48b2ec16c23ccb9124b8abfd9583c5fdfa123fe6b421e03de1", size = 16627546, upload-time = "2025-06-21T12:20:31.002Z" },
{ url = "https://files.pythonhosted.org/packages/6e/ec/3b68220c277e463095342d254c61be8144c31208db18d3fd8ef02712bcd6/numpy-2.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:36890eb9e9d2081137bd78d29050ba63b8dab95dff7912eadf1185e80074b2a0", size = 15562538, upload-time = "2025-06-21T12:20:54.322Z" },
{ url = "https://files.pythonhosted.org/packages/77/2b/4014f2bcc4404484021c74d4c5ee8eb3de7e3f7ac75f06672f8dcf85140a/numpy-2.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a780033466159c2270531e2b8ac063704592a0bc62ec4a1b991c7c40705eb0e8", size = 18360327, upload-time = "2025-06-21T12:21:21.053Z" },
{ url = "https://files.pythonhosted.org/packages/40/8d/2ddd6c9b30fcf920837b8672f6c65590c7d92e43084c25fc65edc22e93ca/numpy-2.3.1-cp313-cp313-win32.whl", hash = "sha256:39bff12c076812595c3a306f22bfe49919c5513aa1e0e70fac756a0be7c2a2b8", size = 6312330, upload-time = "2025-06-21T12:25:07.447Z" },
{ url = "https://files.pythonhosted.org/packages/dd/c8/beaba449925988d415efccb45bf977ff8327a02f655090627318f6398c7b/numpy-2.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:8d5ee6eec45f08ce507a6570e06f2f879b374a552087a4179ea7838edbcbfa42", size = 12731565, upload-time = "2025-06-21T12:25:26.444Z" },
{ url = "https://files.pythonhosted.org/packages/0b/c3/5c0c575d7ec78c1126998071f58facfc124006635da75b090805e642c62e/numpy-2.3.1-cp313-cp313-win_arm64.whl", hash = "sha256:0c4d9e0a8368db90f93bd192bfa771ace63137c3488d198ee21dfb8e7771916e", size = 10190262, upload-time = "2025-06-21T12:25:42.196Z" },
{ url = "https://files.pythonhosted.org/packages/ea/19/a029cd335cf72f79d2644dcfc22d90f09caa86265cbbde3b5702ccef6890/numpy-2.3.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:b0b5397374f32ec0649dd98c652a1798192042e715df918c20672c62fb52d4b8", size = 20987593, upload-time = "2025-06-21T12:21:51.664Z" },
{ url = "https://files.pythonhosted.org/packages/25/91/8ea8894406209107d9ce19b66314194675d31761fe2cb3c84fe2eeae2f37/numpy-2.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:c5bdf2015ccfcee8253fb8be695516ac4457c743473a43290fd36eba6a1777eb", size = 14300523, upload-time = "2025-06-21T12:22:13.583Z" },
{ url = "https://files.pythonhosted.org/packages/a6/7f/06187b0066eefc9e7ce77d5f2ddb4e314a55220ad62dd0bfc9f2c44bac14/numpy-2.3.1-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:d70f20df7f08b90a2062c1f07737dd340adccf2068d0f1b9b3d56e2038979fee", size = 5227993, upload-time = "2025-06-21T12:22:22.53Z" },
{ url = "https://files.pythonhosted.org/packages/e8/ec/a926c293c605fa75e9cfb09f1e4840098ed46d2edaa6e2152ee35dc01ed3/numpy-2.3.1-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:2fb86b7e58f9ac50e1e9dd1290154107e47d1eef23a0ae9145ded06ea606f992", size = 6736652, upload-time = "2025-06-21T12:22:33.629Z" },
{ url = "https://files.pythonhosted.org/packages/e3/62/d68e52fb6fde5586650d4c0ce0b05ff3a48ad4df4ffd1b8866479d1d671d/numpy-2.3.1-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:23ab05b2d241f76cb883ce8b9a93a680752fbfcbd51c50eff0b88b979e471d8c", size = 14331561, upload-time = "2025-06-21T12:22:55.056Z" },
{ url = "https://files.pythonhosted.org/packages/fc/ec/b74d3f2430960044bdad6900d9f5edc2dc0fb8bf5a0be0f65287bf2cbe27/numpy-2.3.1-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:ce2ce9e5de4703a673e705183f64fd5da5bf36e7beddcb63a25ee2286e71ca48", size = 16693349, upload-time = "2025-06-21T12:23:20.53Z" },
{ url = "https://files.pythonhosted.org/packages/0d/15/def96774b9d7eb198ddadfcbd20281b20ebb510580419197e225f5c55c3e/numpy-2.3.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:c4913079974eeb5c16ccfd2b1f09354b8fed7e0d6f2cab933104a09a6419b1ee", size = 15642053, upload-time = "2025-06-21T12:23:43.697Z" },
{ url = "https://files.pythonhosted.org/packages/2b/57/c3203974762a759540c6ae71d0ea2341c1fa41d84e4971a8e76d7141678a/numpy-2.3.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:010ce9b4f00d5c036053ca684c77441f2f2c934fd23bee058b4d6f196efd8280", size = 18434184, upload-time = "2025-06-21T12:24:10.708Z" },
{ url = "https://files.pythonhosted.org/packages/22/8a/ccdf201457ed8ac6245187850aff4ca56a79edbea4829f4e9f14d46fa9a5/numpy-2.3.1-cp313-cp313t-win32.whl", hash = "sha256:6269b9edfe32912584ec496d91b00b6d34282ca1d07eb10e82dfc780907d6c2e", size = 6440678, upload-time = "2025-06-21T12:24:21.596Z" },
{ url = "https://files.pythonhosted.org/packages/f1/7e/7f431d8bd8eb7e03d79294aed238b1b0b174b3148570d03a8a8a8f6a0da9/numpy-2.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:2a809637460e88a113e186e87f228d74ae2852a2e0c44de275263376f17b5bdc", size = 12870697, upload-time = "2025-06-21T12:24:40.644Z" },
{ url = "https://files.pythonhosted.org/packages/d4/ca/af82bf0fad4c3e573c6930ed743b5308492ff19917c7caaf2f9b6f9e2e98/numpy-2.3.1-cp313-cp313t-win_arm64.whl", hash = "sha256:eccb9a159db9aed60800187bc47a6d3451553f0e1b08b068d8b277ddfbb9b244", size = 10260376, upload-time = "2025-06-21T12:24:56.884Z" },
{ url = "https://files.pythonhosted.org/packages/5b/82/aec2fc9b9b149f372850291827537a508d6c4d3664b1750a324b91f71355/mypy-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:93378d3203a5c0800c6b6d850ad2f19f7a3cdf1a3701d3416dbf128805c6a6a7", size = 11075338, upload-time = "2025-07-31T07:53:38.873Z" },
{ url = "https://files.pythonhosted.org/packages/07/ac/ee93fbde9d2242657128af8c86f5d917cd2887584cf948a8e3663d0cd737/mypy-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:15d54056f7fe7a826d897789f53dd6377ec2ea8ba6f776dc83c2902b899fee81", size = 10113066, upload-time = "2025-07-31T07:54:14.707Z" },
{ url = "https://files.pythonhosted.org/packages/5a/68/946a1e0be93f17f7caa56c45844ec691ca153ee8b62f21eddda336a2d203/mypy-1.17.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:209a58fed9987eccc20f2ca94afe7257a8f46eb5df1fb69958650973230f91e6", size = 11875473, upload-time = "2025-07-31T07:53:14.504Z" },
{ url = "https://files.pythonhosted.org/packages/9f/0f/478b4dce1cb4f43cf0f0d00fba3030b21ca04a01b74d1cd272a528cf446f/mypy-1.17.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:099b9a5da47de9e2cb5165e581f158e854d9e19d2e96b6698c0d64de911dd849", size = 12744296, upload-time = "2025-07-31T07:53:03.896Z" },
{ url = "https://files.pythonhosted.org/packages/ca/70/afa5850176379d1b303f992a828de95fc14487429a7139a4e0bdd17a8279/mypy-1.17.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa6ffadfbe6994d724c5a1bb6123a7d27dd68fc9c059561cd33b664a79578e14", size = 12914657, upload-time = "2025-07-31T07:54:08.576Z" },
{ url = "https://files.pythonhosted.org/packages/53/f9/4a83e1c856a3d9c8f6edaa4749a4864ee98486e9b9dbfbc93842891029c2/mypy-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:9a2b7d9180aed171f033c9f2fc6c204c1245cf60b0cb61cf2e7acc24eea78e0a", size = 9593320, upload-time = "2025-07-31T07:53:01.341Z" },
{ url = "https://files.pythonhosted.org/packages/38/56/79c2fac86da57c7d8c48622a05873eaab40b905096c33597462713f5af90/mypy-1.17.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:15a83369400454c41ed3a118e0cc58bd8123921a602f385cb6d6ea5df050c733", size = 11040037, upload-time = "2025-07-31T07:54:10.942Z" },
{ url = "https://files.pythonhosted.org/packages/4d/c3/adabe6ff53638e3cad19e3547268482408323b1e68bf082c9119000cd049/mypy-1.17.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:55b918670f692fc9fba55c3298d8a3beae295c5cded0a55dccdc5bbead814acd", size = 10131550, upload-time = "2025-07-31T07:53:41.307Z" },
{ url = "https://files.pythonhosted.org/packages/b8/c5/2e234c22c3bdeb23a7817af57a58865a39753bde52c74e2c661ee0cfc640/mypy-1.17.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:62761474061feef6f720149d7ba876122007ddc64adff5ba6f374fda35a018a0", size = 11872963, upload-time = "2025-07-31T07:53:16.878Z" },
{ url = "https://files.pythonhosted.org/packages/ab/26/c13c130f35ca8caa5f2ceab68a247775648fdcd6c9a18f158825f2bc2410/mypy-1.17.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c49562d3d908fd49ed0938e5423daed8d407774a479b595b143a3d7f87cdae6a", size = 12710189, upload-time = "2025-07-31T07:54:01.962Z" },
{ url = "https://files.pythonhosted.org/packages/82/df/c7d79d09f6de8383fe800521d066d877e54d30b4fb94281c262be2df84ef/mypy-1.17.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:397fba5d7616a5bc60b45c7ed204717eaddc38f826e3645402c426057ead9a91", size = 12900322, upload-time = "2025-07-31T07:53:10.551Z" },
{ url = "https://files.pythonhosted.org/packages/b8/98/3d5a48978b4f708c55ae832619addc66d677f6dc59f3ebad71bae8285ca6/mypy-1.17.1-cp314-cp314-win_amd64.whl", hash = "sha256:9d6b20b97d373f41617bd0708fd46aa656059af57f2ef72aa8c7d6a2b73b74ed", size = 9751879, upload-time = "2025-07-31T07:52:56.683Z" },
{ url = "https://files.pythonhosted.org/packages/1d/f3/8fcd2af0f5b806f6cf463efaffd3c9548a28f84220493ecd38d127b6b66d/mypy-1.17.1-py3-none-any.whl", hash = "sha256:a9f52c0351c21fe24c21d8c0eb1f62967b262d6729393397b6f443c3b773c3b9", size = 2283411, upload-time = "2025-07-31T07:53:24.664Z" },
]
[[package]]
name = "mypy-extensions"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
]
[[package]]
name = "numpy"
version = "2.3.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/37/7d/3fec4199c5ffb892bed55cff901e4f39a58c81df9c44c280499e92cad264/numpy-2.3.2.tar.gz", hash = "sha256:e0486a11ec30cdecb53f184d496d1c6a20786c81e55e41640270130056f8ee48", size = 20489306, upload-time = "2025-07-24T21:32:07.553Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1c/c0/c6bb172c916b00700ed3bf71cb56175fd1f7dbecebf8353545d0b5519f6c/numpy-2.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c8d9727f5316a256425892b043736d63e89ed15bbfe6556c5ff4d9d4448ff3b3", size = 20949074, upload-time = "2025-07-24T20:43:07.813Z" },
{ url = "https://files.pythonhosted.org/packages/20/4e/c116466d22acaf4573e58421c956c6076dc526e24a6be0903219775d862e/numpy-2.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:efc81393f25f14d11c9d161e46e6ee348637c0a1e8a54bf9dedc472a3fae993b", size = 14177311, upload-time = "2025-07-24T20:43:29.335Z" },
{ url = "https://files.pythonhosted.org/packages/78/45/d4698c182895af189c463fc91d70805d455a227261d950e4e0f1310c2550/numpy-2.3.2-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:dd937f088a2df683cbb79dda9a772b62a3e5a8a7e76690612c2737f38c6ef1b6", size = 5106022, upload-time = "2025-07-24T20:43:37.999Z" },
{ url = "https://files.pythonhosted.org/packages/9f/76/3e6880fef4420179309dba72a8c11f6166c431cf6dee54c577af8906f914/numpy-2.3.2-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:11e58218c0c46c80509186e460d79fbdc9ca1eb8d8aee39d8f2dc768eb781089", size = 6640135, upload-time = "2025-07-24T20:43:49.28Z" },
{ url = "https://files.pythonhosted.org/packages/34/fa/87ff7f25b3c4ce9085a62554460b7db686fef1e0207e8977795c7b7d7ba1/numpy-2.3.2-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5ad4ebcb683a1f99f4f392cc522ee20a18b2bb12a2c1c42c3d48d5a1adc9d3d2", size = 14278147, upload-time = "2025-07-24T20:44:10.328Z" },
{ url = "https://files.pythonhosted.org/packages/1d/0f/571b2c7a3833ae419fe69ff7b479a78d313581785203cc70a8db90121b9a/numpy-2.3.2-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:938065908d1d869c7d75d8ec45f735a034771c6ea07088867f713d1cd3bbbe4f", size = 16635989, upload-time = "2025-07-24T20:44:34.88Z" },
{ url = "https://files.pythonhosted.org/packages/24/5a/84ae8dca9c9a4c592fe11340b36a86ffa9fd3e40513198daf8a97839345c/numpy-2.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:66459dccc65d8ec98cc7df61307b64bf9e08101f9598755d42d8ae65d9a7a6ee", size = 16053052, upload-time = "2025-07-24T20:44:58.872Z" },
{ url = "https://files.pythonhosted.org/packages/57/7c/e5725d99a9133b9813fcf148d3f858df98511686e853169dbaf63aec6097/numpy-2.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a7af9ed2aa9ec5950daf05bb11abc4076a108bd3c7db9aa7251d5f107079b6a6", size = 18577955, upload-time = "2025-07-24T20:45:26.714Z" },
{ url = "https://files.pythonhosted.org/packages/ae/11/7c546fcf42145f29b71e4d6f429e96d8d68e5a7ba1830b2e68d7418f0bbd/numpy-2.3.2-cp313-cp313-win32.whl", hash = "sha256:906a30249315f9c8e17b085cc5f87d3f369b35fedd0051d4a84686967bdbbd0b", size = 6311843, upload-time = "2025-07-24T20:49:24.444Z" },
{ url = "https://files.pythonhosted.org/packages/aa/6f/a428fd1cb7ed39b4280d057720fed5121b0d7754fd2a9768640160f5517b/numpy-2.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:c63d95dc9d67b676e9108fe0d2182987ccb0f11933c1e8959f42fa0da8d4fa56", size = 12782876, upload-time = "2025-07-24T20:49:43.227Z" },
{ url = "https://files.pythonhosted.org/packages/65/85/4ea455c9040a12595fb6c43f2c217257c7b52dd0ba332c6a6c1d28b289fe/numpy-2.3.2-cp313-cp313-win_arm64.whl", hash = "sha256:b05a89f2fb84d21235f93de47129dd4f11c16f64c87c33f5e284e6a3a54e43f2", size = 10192786, upload-time = "2025-07-24T20:49:59.443Z" },
{ url = "https://files.pythonhosted.org/packages/80/23/8278f40282d10c3f258ec3ff1b103d4994bcad78b0cba9208317f6bb73da/numpy-2.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4e6ecfeddfa83b02318f4d84acf15fbdbf9ded18e46989a15a8b6995dfbf85ab", size = 21047395, upload-time = "2025-07-24T20:45:58.821Z" },
{ url = "https://files.pythonhosted.org/packages/1f/2d/624f2ce4a5df52628b4ccd16a4f9437b37c35f4f8a50d00e962aae6efd7a/numpy-2.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:508b0eada3eded10a3b55725b40806a4b855961040180028f52580c4729916a2", size = 14300374, upload-time = "2025-07-24T20:46:20.207Z" },
{ url = "https://files.pythonhosted.org/packages/f6/62/ff1e512cdbb829b80a6bd08318a58698867bca0ca2499d101b4af063ee97/numpy-2.3.2-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:754d6755d9a7588bdc6ac47dc4ee97867271b17cee39cb87aef079574366db0a", size = 5228864, upload-time = "2025-07-24T20:46:30.58Z" },
{ url = "https://files.pythonhosted.org/packages/7d/8e/74bc18078fff03192d4032cfa99d5a5ca937807136d6f5790ce07ca53515/numpy-2.3.2-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:a9f66e7d2b2d7712410d3bc5684149040ef5f19856f20277cd17ea83e5006286", size = 6737533, upload-time = "2025-07-24T20:46:46.111Z" },
{ url = "https://files.pythonhosted.org/packages/19/ea/0731efe2c9073ccca5698ef6a8c3667c4cf4eea53fcdcd0b50140aba03bc/numpy-2.3.2-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:de6ea4e5a65d5a90c7d286ddff2b87f3f4ad61faa3db8dabe936b34c2275b6f8", size = 14352007, upload-time = "2025-07-24T20:47:07.1Z" },
{ url = "https://files.pythonhosted.org/packages/cf/90/36be0865f16dfed20f4bc7f75235b963d5939707d4b591f086777412ff7b/numpy-2.3.2-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a3ef07ec8cbc8fc9e369c8dcd52019510c12da4de81367d8b20bc692aa07573a", size = 16701914, upload-time = "2025-07-24T20:47:32.459Z" },
{ url = "https://files.pythonhosted.org/packages/94/30/06cd055e24cb6c38e5989a9e747042b4e723535758e6153f11afea88c01b/numpy-2.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:27c9f90e7481275c7800dc9c24b7cc40ace3fdb970ae4d21eaff983a32f70c91", size = 16132708, upload-time = "2025-07-24T20:47:58.129Z" },
{ url = "https://files.pythonhosted.org/packages/9a/14/ecede608ea73e58267fd7cb78f42341b3b37ba576e778a1a06baffbe585c/numpy-2.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:07b62978075b67eee4065b166d000d457c82a1efe726cce608b9db9dd66a73a5", size = 18651678, upload-time = "2025-07-24T20:48:25.402Z" },
{ url = "https://files.pythonhosted.org/packages/40/f3/2fe6066b8d07c3685509bc24d56386534c008b462a488b7f503ba82b8923/numpy-2.3.2-cp313-cp313t-win32.whl", hash = "sha256:c771cfac34a4f2c0de8e8c97312d07d64fd8f8ed45bc9f5726a7e947270152b5", size = 6441832, upload-time = "2025-07-24T20:48:37.181Z" },
{ url = "https://files.pythonhosted.org/packages/0b/ba/0937d66d05204d8f28630c9c60bc3eda68824abde4cf756c4d6aad03b0c6/numpy-2.3.2-cp313-cp313t-win_amd64.whl", hash = "sha256:72dbebb2dcc8305c431b2836bcc66af967df91be793d63a24e3d9b741374c450", size = 12927049, upload-time = "2025-07-24T20:48:56.24Z" },
{ url = "https://files.pythonhosted.org/packages/e9/ed/13542dd59c104d5e654dfa2ac282c199ba64846a74c2c4bcdbc3a0f75df1/numpy-2.3.2-cp313-cp313t-win_arm64.whl", hash = "sha256:72c6df2267e926a6d5286b0a6d556ebe49eae261062059317837fda12ddf0c1a", size = 10262935, upload-time = "2025-07-24T20:49:13.136Z" },
{ url = "https://files.pythonhosted.org/packages/c9/7c/7659048aaf498f7611b783e000c7268fcc4dcf0ce21cd10aad7b2e8f9591/numpy-2.3.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:448a66d052d0cf14ce9865d159bfc403282c9bc7bb2a31b03cc18b651eca8b1a", size = 20950906, upload-time = "2025-07-24T20:50:30.346Z" },
{ url = "https://files.pythonhosted.org/packages/80/db/984bea9d4ddf7112a04cfdfb22b1050af5757864cfffe8e09e44b7f11a10/numpy-2.3.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:546aaf78e81b4081b2eba1d105c3b34064783027a06b3ab20b6eba21fb64132b", size = 14185607, upload-time = "2025-07-24T20:50:51.923Z" },
{ url = "https://files.pythonhosted.org/packages/e4/76/b3d6f414f4eca568f469ac112a3b510938d892bc5a6c190cb883af080b77/numpy-2.3.2-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:87c930d52f45df092f7578889711a0768094debf73cfcde105e2d66954358125", size = 5114110, upload-time = "2025-07-24T20:51:01.041Z" },
{ url = "https://files.pythonhosted.org/packages/9e/d2/6f5e6826abd6bca52392ed88fe44a4b52aacb60567ac3bc86c67834c3a56/numpy-2.3.2-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:8dc082ea901a62edb8f59713c6a7e28a85daddcb67454c839de57656478f5b19", size = 6642050, upload-time = "2025-07-24T20:51:11.64Z" },
{ url = "https://files.pythonhosted.org/packages/c4/43/f12b2ade99199e39c73ad182f103f9d9791f48d885c600c8e05927865baf/numpy-2.3.2-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:af58de8745f7fa9ca1c0c7c943616c6fe28e75d0c81f5c295810e3c83b5be92f", size = 14296292, upload-time = "2025-07-24T20:51:33.488Z" },
{ url = "https://files.pythonhosted.org/packages/5d/f9/77c07d94bf110a916b17210fac38680ed8734c236bfed9982fd8524a7b47/numpy-2.3.2-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed5527c4cf10f16c6d0b6bee1f89958bccb0ad2522c8cadc2efd318bcd545f5", size = 16638913, upload-time = "2025-07-24T20:51:58.517Z" },
{ url = "https://files.pythonhosted.org/packages/9b/d1/9d9f2c8ea399cc05cfff8a7437453bd4e7d894373a93cdc46361bbb49a7d/numpy-2.3.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:095737ed986e00393ec18ec0b21b47c22889ae4b0cd2d5e88342e08b01141f58", size = 16071180, upload-time = "2025-07-24T20:52:22.827Z" },
{ url = "https://files.pythonhosted.org/packages/4c/41/82e2c68aff2a0c9bf315e47d61951099fed65d8cb2c8d9dc388cb87e947e/numpy-2.3.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:b5e40e80299607f597e1a8a247ff8d71d79c5b52baa11cc1cce30aa92d2da6e0", size = 18576809, upload-time = "2025-07-24T20:52:51.015Z" },
{ url = "https://files.pythonhosted.org/packages/14/14/4b4fd3efb0837ed252d0f583c5c35a75121038a8c4e065f2c259be06d2d8/numpy-2.3.2-cp314-cp314-win32.whl", hash = "sha256:7d6e390423cc1f76e1b8108c9b6889d20a7a1f59d9a60cac4a050fa734d6c1e2", size = 6366410, upload-time = "2025-07-24T20:56:44.949Z" },
{ url = "https://files.pythonhosted.org/packages/11/9e/b4c24a6b8467b61aced5c8dc7dcfce23621baa2e17f661edb2444a418040/numpy-2.3.2-cp314-cp314-win_amd64.whl", hash = "sha256:b9d0878b21e3918d76d2209c924ebb272340da1fb51abc00f986c258cd5e957b", size = 12918821, upload-time = "2025-07-24T20:57:06.479Z" },
{ url = "https://files.pythonhosted.org/packages/0e/0f/0dc44007c70b1007c1cef86b06986a3812dd7106d8f946c09cfa75782556/numpy-2.3.2-cp314-cp314-win_arm64.whl", hash = "sha256:2738534837c6a1d0c39340a190177d7d66fdf432894f469728da901f8f6dc910", size = 10477303, upload-time = "2025-07-24T20:57:22.879Z" },
{ url = "https://files.pythonhosted.org/packages/8b/3e/075752b79140b78ddfc9c0a1634d234cfdbc6f9bbbfa6b7504e445ad7d19/numpy-2.3.2-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:4d002ecf7c9b53240be3bb69d80f86ddbd34078bae04d87be81c1f58466f264e", size = 21047524, upload-time = "2025-07-24T20:53:22.086Z" },
{ url = "https://files.pythonhosted.org/packages/fe/6d/60e8247564a72426570d0e0ea1151b95ce5bd2f1597bb878a18d32aec855/numpy-2.3.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:293b2192c6bcce487dbc6326de5853787f870aeb6c43f8f9c6496db5b1781e45", size = 14300519, upload-time = "2025-07-24T20:53:44.053Z" },
{ url = "https://files.pythonhosted.org/packages/4d/73/d8326c442cd428d47a067070c3ac6cc3b651a6e53613a1668342a12d4479/numpy-2.3.2-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:0a4f2021a6da53a0d580d6ef5db29947025ae8b35b3250141805ea9a32bbe86b", size = 5228972, upload-time = "2025-07-24T20:53:53.81Z" },
{ url = "https://files.pythonhosted.org/packages/34/2e/e71b2d6dad075271e7079db776196829019b90ce3ece5c69639e4f6fdc44/numpy-2.3.2-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:9c144440db4bf3bb6372d2c3e49834cc0ff7bb4c24975ab33e01199e645416f2", size = 6737439, upload-time = "2025-07-24T20:54:04.742Z" },
{ url = "https://files.pythonhosted.org/packages/15/b0/d004bcd56c2c5e0500ffc65385eb6d569ffd3363cb5e593ae742749b2daa/numpy-2.3.2-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f92d6c2a8535dc4fe4419562294ff957f83a16ebdec66df0805e473ffaad8bd0", size = 14352479, upload-time = "2025-07-24T20:54:25.819Z" },
{ url = "https://files.pythonhosted.org/packages/11/e3/285142fcff8721e0c99b51686426165059874c150ea9ab898e12a492e291/numpy-2.3.2-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cefc2219baa48e468e3db7e706305fcd0c095534a192a08f31e98d83a7d45fb0", size = 16702805, upload-time = "2025-07-24T20:54:50.814Z" },
{ url = "https://files.pythonhosted.org/packages/33/c3/33b56b0e47e604af2c7cd065edca892d180f5899599b76830652875249a3/numpy-2.3.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:76c3e9501ceb50b2ff3824c3589d5d1ab4ac857b0ee3f8f49629d0de55ecf7c2", size = 16133830, upload-time = "2025-07-24T20:55:17.306Z" },
{ url = "https://files.pythonhosted.org/packages/6e/ae/7b1476a1f4d6a48bc669b8deb09939c56dd2a439db1ab03017844374fb67/numpy-2.3.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:122bf5ed9a0221b3419672493878ba4967121514b1d7d4656a7580cd11dddcbf", size = 18652665, upload-time = "2025-07-24T20:55:46.665Z" },
{ url = "https://files.pythonhosted.org/packages/14/ba/5b5c9978c4bb161034148ade2de9db44ec316fab89ce8c400db0e0c81f86/numpy-2.3.2-cp314-cp314t-win32.whl", hash = "sha256:6f1ae3dcb840edccc45af496f312528c15b1f79ac318169d094e85e4bb35fdf1", size = 6514777, upload-time = "2025-07-24T20:55:57.66Z" },
{ url = "https://files.pythonhosted.org/packages/eb/46/3dbaf0ae7c17cdc46b9f662c56da2054887b8d9e737c1476f335c83d33db/numpy-2.3.2-cp314-cp314t-win_amd64.whl", hash = "sha256:087ffc25890d89a43536f75c5fe8770922008758e8eeeef61733957041ed2f9b", size = 13111856, upload-time = "2025-07-24T20:56:17.318Z" },
{ url = "https://files.pythonhosted.org/packages/c1/9e/1652778bce745a67b5fe05adde60ed362d38eb17d919a540e813d30f6874/numpy-2.3.2-cp314-cp314t-win_arm64.whl", hash = "sha256:092aeb3449833ea9c0bf0089d70c29ae480685dd2377ec9cdbbb620257f84631", size = 10544226, upload-time = "2025-07-24T20:56:34.509Z" },
]

[[package]]
@@ -373,12 +452,17 @@ dependencies = [
[package.dev-dependencies]
dev = [
{ name = "httpx" },
{ name = "mypy" },
{ name = "pytest" },
{ name = "pytest-asyncio" },
{ name = "pytest-cov" },
{ name = "pytest-mock" },
{ name = "respx" },
{ name = "ruff" },
{ name = "types-python-dateutil" },
{ name = "types-pytz" },
{ name = "types-pyyaml" },
{ name = "types-requests" },
]

[package.metadata]
@@ -401,12 +485,17 @@ requires-dist = [
[package.metadata.requires-dev]
dev = [
{ name = "httpx", specifier = ">=0" },
{ name = "mypy", specifier = ">=1" },
{ name = "pytest", specifier = ">=8" },
{ name = "pytest-asyncio", specifier = ">=0" },
{ name = "pytest-cov", specifier = ">=0" },
{ name = "pytest-mock", specifier = ">=0" },
{ name = "respx", specifier = ">=0" },
{ name = "ruff", specifier = ">=0" },
{ name = "types-python-dateutil", specifier = ">=2" },
{ name = "types-pytz", specifier = ">=2025" },
{ name = "types-pyyaml", specifier = ">=6" },
{ name = "types-requests", specifier = ">=2" },
]

[[package]]
@@ -418,6 +507,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]
[[package]]
name = "pathspec"
version = "0.12.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" },
]
[[package]]
name = "pluggy"
version = "1.6.0"
@@ -511,14 +609,14 @@ wheels = [

[[package]]
name = "pytest-asyncio"
version = "1.0.0"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pytest" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d0/d4/14f53324cb1a6381bef29d698987625d80052bb33932d8e7cbf9b337b17c/pytest_asyncio-1.0.0.tar.gz", hash = "sha256:d15463d13f4456e1ead2594520216b225a16f781e144f8fdf6c5bb4667c48b3f", size = 46960, upload-time = "2025-05-26T04:54:40.484Z" }
sdist = { url = "https://files.pythonhosted.org/packages/4e/51/f8794af39eeb870e87a8c8068642fc07bce0c854d6865d7dd0f2a9d338c2/pytest_asyncio-1.1.0.tar.gz", hash = "sha256:796aa822981e01b68c12e4827b8697108f7205020f24b5793b3c41555dab68ea", size = 46652, upload-time = "2025-07-16T04:29:26.393Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/30/05/ce271016e351fddc8399e546f6e23761967ee09c8c568bbfbecb0c150171/pytest_asyncio-1.0.0-py3-none-any.whl", hash = "sha256:4f024da9f1ef945e680dc68610b52550e36590a67fd31bb3b4943979a1f90ef3", size = 15976, upload-time = "2025-05-26T04:54:39.035Z" },
{ url = "https://files.pythonhosted.org/packages/c7/9d/bf86eddabf8c6c9cb1ea9a869d6873b46f105a5d292d3a6f7071f5b07935/pytest_asyncio-1.1.0-py3-none-any.whl", hash = "sha256:5fe2d69607b0bd75c656d1211f969cadba035030156745ee09e7d71740e58ecf", size = 15157, upload-time = "2025-07-16T04:29:24.929Z" },
]

[[package]]
@@ -605,25 +703,38 @@ wheels = [

[[package]]
name = "regex"
version = "2024.11.6"
version = "2025.7.34"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8e/5f/bd69653fbfb76cf8604468d3b4ec4c403197144c7bfe0e6a5fc9e02a07cb/regex-2024.11.6.tar.gz", hash = "sha256:7ab159b063c52a0333c884e4679f8d7a85112ee3078fe3d9004b2dd875585519", size = 399494, upload-time = "2024-11-06T20:12:31.635Z" }
sdist = { url = "https://files.pythonhosted.org/packages/0b/de/e13fa6dc61d78b30ba47481f99933a3b49a57779d625c392d8036770a60d/regex-2025.7.34.tar.gz", hash = "sha256:9ead9765217afd04a86822dfcd4ed2747dfe426e887da413b15ff0ac2457e21a", size = 400714, upload-time = "2025-07-31T00:21:16.262Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/90/73/bcb0e36614601016552fa9344544a3a2ae1809dc1401b100eab02e772e1f/regex-2024.11.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a6ba92c0bcdf96cbf43a12c717eae4bc98325ca3730f6b130ffa2e3c3c723d84", size = 483525, upload-time = "2024-11-06T20:10:45.19Z" },
{ url = "https://files.pythonhosted.org/packages/0f/3f/f1a082a46b31e25291d830b369b6b0c5576a6f7fb89d3053a354c24b8a83/regex-2024.11.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:525eab0b789891ac3be914d36893bdf972d483fe66551f79d3e27146191a37d4", size = 288324, upload-time = "2024-11-06T20:10:47.177Z" },
{ url = "https://files.pythonhosted.org/packages/09/c9/4e68181a4a652fb3ef5099e077faf4fd2a694ea6e0f806a7737aff9e758a/regex-2024.11.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:086a27a0b4ca227941700e0b31425e7a28ef1ae8e5e05a33826e17e47fbfdba0", size = 284617, upload-time = "2024-11-06T20:10:49.312Z" },
{ url = "https://files.pythonhosted.org/packages/fc/fd/37868b75eaf63843165f1d2122ca6cb94bfc0271e4428cf58c0616786dce/regex-2024.11.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bde01f35767c4a7899b7eb6e823b125a64de314a8ee9791367c9a34d56af18d0", size = 795023, upload-time = "2024-11-06T20:10:51.102Z" },
{ url = "https://files.pythonhosted.org/packages/c4/7c/d4cd9c528502a3dedb5c13c146e7a7a539a3853dc20209c8e75d9ba9d1b2/regex-2024.11.6-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b583904576650166b3d920d2bcce13971f6f9e9a396c673187f49811b2769dc7", size = 833072, upload-time = "2024-11-06T20:10:52.926Z" },
{ url = "https://files.pythonhosted.org/packages/4f/db/46f563a08f969159c5a0f0e722260568425363bea43bb7ae370becb66a67/regex-2024.11.6-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c4de13f06a0d54fa0d5ab1b7138bfa0d883220965a29616e3ea61b35d5f5fc7", size = 823130, upload-time = "2024-11-06T20:10:54.828Z" },
{ url = "https://files.pythonhosted.org/packages/db/60/1eeca2074f5b87df394fccaa432ae3fc06c9c9bfa97c5051aed70e6e00c2/regex-2024.11.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cde6e9f2580eb1665965ce9bf17ff4952f34f5b126beb509fee8f4e994f143c", size = 796857, upload-time = "2024-11-06T20:10:56.634Z" },
{ url = "https://files.pythonhosted.org/packages/10/db/ac718a08fcee981554d2f7bb8402f1faa7e868c1345c16ab1ebec54b0d7b/regex-2024.11.6-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d7f453dca13f40a02b79636a339c5b62b670141e63efd511d3f8f73fba162b3", size = 784006, upload-time = "2024-11-06T20:10:59.369Z" },
{ url = "https://files.pythonhosted.org/packages/c2/41/7da3fe70216cea93144bf12da2b87367590bcf07db97604edeea55dac9ad/regex-2024.11.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:59dfe1ed21aea057a65c6b586afd2a945de04fc7db3de0a6e3ed5397ad491b07", size = 781650, upload-time = "2024-11-06T20:11:02.042Z" },
{ url = "https://files.pythonhosted.org/packages/a7/d5/880921ee4eec393a4752e6ab9f0fe28009435417c3102fc413f3fe81c4e5/regex-2024.11.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b97c1e0bd37c5cd7902e65f410779d39eeda155800b65fc4d04cc432efa9bc6e", size = 789545, upload-time = "2024-11-06T20:11:03.933Z" },
{ url = "https://files.pythonhosted.org/packages/dc/96/53770115e507081122beca8899ab7f5ae28ae790bfcc82b5e38976df6a77/regex-2024.11.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d1e379028e0fc2ae3654bac3cbbef81bf3fd571272a42d56c24007979bafb6", size = 853045, upload-time = "2024-11-06T20:11:06.497Z" },
{ url = "https://files.pythonhosted.org/packages/31/d3/1372add5251cc2d44b451bd94f43b2ec78e15a6e82bff6a290ef9fd8f00a/regex-2024.11.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:13291b39131e2d002a7940fb176e120bec5145f3aeb7621be6534e46251912c4", size = 860182, upload-time = "2024-11-06T20:11:09.06Z" },
{ url = "https://files.pythonhosted.org/packages/ed/e3/c446a64984ea9f69982ba1a69d4658d5014bc7a0ea468a07e1a1265db6e2/regex-2024.11.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f51f88c126370dcec4908576c5a627220da6c09d0bff31cfa89f2523843316d", size = 787733, upload-time = "2024-11-06T20:11:11.256Z" },
{ url = "https://files.pythonhosted.org/packages/2b/f1/e40c8373e3480e4f29f2692bd21b3e05f296d3afebc7e5dcf21b9756ca1c/regex-2024.11.6-cp313-cp313-win32.whl", hash = "sha256:63b13cfd72e9601125027202cad74995ab26921d8cd935c25f09c630436348ff", size = 262122, upload-time = "2024-11-06T20:11:13.161Z" },
{ url = "https://files.pythonhosted.org/packages/45/94/bc295babb3062a731f52621cdc992d123111282e291abaf23faa413443ea/regex-2024.11.6-cp313-cp313-win_amd64.whl", hash = "sha256:2b3361af3198667e99927da8b84c1b010752fa4b1115ee30beaa332cabc3ef1a", size = 273545, upload-time = "2024-11-06T20:11:15Z" },
{ url = "https://files.pythonhosted.org/packages/15/16/b709b2119975035169a25aa8e4940ca177b1a2e25e14f8d996d09130368e/regex-2025.7.34-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c3c9740a77aeef3f5e3aaab92403946a8d34437db930a0280e7e81ddcada61f5", size = 485334, upload-time = "2025-07-31T00:19:56.58Z" },
{ url = "https://files.pythonhosted.org/packages/94/a6/c09136046be0595f0331bc58a0e5f89c2d324cf734e0b0ec53cf4b12a636/regex-2025.7.34-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:69ed3bc611540f2ea70a4080f853741ec698be556b1df404599f8724690edbcd", size = 289942, upload-time = "2025-07-31T00:19:57.943Z" },
{ url = "https://files.pythonhosted.org/packages/36/91/08fc0fd0f40bdfb0e0df4134ee37cfb16e66a1044ac56d36911fd01c69d2/regex-2025.7.34-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d03c6f9dcd562c56527c42b8530aad93193e0b3254a588be1f2ed378cdfdea1b", size = 285991, upload-time = "2025-07-31T00:19:59.837Z" },
{ url = "https://files.pythonhosted.org/packages/be/2f/99dc8f6f756606f0c214d14c7b6c17270b6bbe26d5c1f05cde9dbb1c551f/regex-2025.7.34-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6164b1d99dee1dfad33f301f174d8139d4368a9fb50bf0a3603b2eaf579963ad", size = 797415, upload-time = "2025-07-31T00:20:01.668Z" },
{ url = "https://files.pythonhosted.org/packages/62/cf/2fcdca1110495458ba4e95c52ce73b361cf1cafd8a53b5c31542cde9a15b/regex-2025.7.34-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1e4f4f62599b8142362f164ce776f19d79bdd21273e86920a7b604a4275b4f59", size = 862487, upload-time = "2025-07-31T00:20:03.142Z" },
{ url = "https://files.pythonhosted.org/packages/90/38/899105dd27fed394e3fae45607c1983e138273ec167e47882fc401f112b9/regex-2025.7.34-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:72a26dcc6a59c057b292f39d41465d8233a10fd69121fa24f8f43ec6294e5415", size = 910717, upload-time = "2025-07-31T00:20:04.727Z" },
{ url = "https://files.pythonhosted.org/packages/ee/f6/4716198dbd0bcc9c45625ac4c81a435d1c4d8ad662e8576dac06bab35b17/regex-2025.7.34-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d5273fddf7a3e602695c92716c420c377599ed3c853ea669c1fe26218867002f", size = 801943, upload-time = "2025-07-31T00:20:07.1Z" },
{ url = "https://files.pythonhosted.org/packages/40/5d/cff8896d27e4e3dd11dd72ac78797c7987eb50fe4debc2c0f2f1682eb06d/regex-2025.7.34-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c1844be23cd40135b3a5a4dd298e1e0c0cb36757364dd6cdc6025770363e06c1", size = 786664, upload-time = "2025-07-31T00:20:08.818Z" },
{ url = "https://files.pythonhosted.org/packages/10/29/758bf83cf7b4c34f07ac3423ea03cee3eb3176941641e4ccc05620f6c0b8/regex-2025.7.34-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:dde35e2afbbe2272f8abee3b9fe6772d9b5a07d82607b5788e8508974059925c", size = 856457, upload-time = "2025-07-31T00:20:10.328Z" },
{ url = "https://files.pythonhosted.org/packages/d7/30/c19d212b619963c5b460bfed0ea69a092c6a43cba52a973d46c27b3e2975/regex-2025.7.34-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:f3f6e8e7af516a7549412ce57613e859c3be27d55341a894aacaa11703a4c31a", size = 849008, upload-time = "2025-07-31T00:20:11.823Z" },
{ url = "https://files.pythonhosted.org/packages/9e/b8/3c35da3b12c87e3cc00010ef6c3a4ae787cff0bc381aa3d251def219969a/regex-2025.7.34-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:469142fb94a869beb25b5f18ea87646d21def10fbacb0bcb749224f3509476f0", size = 788101, upload-time = "2025-07-31T00:20:13.729Z" },
{ url = "https://files.pythonhosted.org/packages/47/80/2f46677c0b3c2b723b2c358d19f9346e714113865da0f5f736ca1a883bde/regex-2025.7.34-cp313-cp313-win32.whl", hash = "sha256:da7507d083ee33ccea1310447410c27ca11fb9ef18c95899ca57ff60a7e4d8f1", size = 264401, upload-time = "2025-07-31T00:20:15.233Z" },
{ url = "https://files.pythonhosted.org/packages/be/fa/917d64dd074682606a003cba33585c28138c77d848ef72fc77cbb1183849/regex-2025.7.34-cp313-cp313-win_amd64.whl", hash = "sha256:9d644de5520441e5f7e2db63aec2748948cc39ed4d7a87fd5db578ea4043d997", size = 275368, upload-time = "2025-07-31T00:20:16.711Z" },
{ url = "https://files.pythonhosted.org/packages/65/cd/f94383666704170a2154a5df7b16be28f0c27a266bffcd843e58bc84120f/regex-2025.7.34-cp313-cp313-win_arm64.whl", hash = "sha256:7bf1c5503a9f2cbd2f52d7e260acb3131b07b6273c470abb78568174fe6bde3f", size = 268482, upload-time = "2025-07-31T00:20:18.189Z" },
{ url = "https://files.pythonhosted.org/packages/ac/23/6376f3a23cf2f3c00514b1cdd8c990afb4dfbac3cb4a68b633c6b7e2e307/regex-2025.7.34-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:8283afe7042d8270cecf27cca558873168e771183d4d593e3c5fe5f12402212a", size = 485385, upload-time = "2025-07-31T00:20:19.692Z" },
{ url = "https://files.pythonhosted.org/packages/73/5b/6d4d3a0b4d312adbfd6d5694c8dddcf1396708976dd87e4d00af439d962b/regex-2025.7.34-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6c053f9647e3421dd2f5dff8172eb7b4eec129df9d1d2f7133a4386319b47435", size = 289788, upload-time = "2025-07-31T00:20:21.941Z" },
{ url = "https://files.pythonhosted.org/packages/92/71/5862ac9913746e5054d01cb9fb8125b3d0802c0706ef547cae1e7f4428fa/regex-2025.7.34-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a16dd56bbcb7d10e62861c3cd000290ddff28ea142ffb5eb3470f183628011ac", size = 286136, upload-time = "2025-07-31T00:20:26.146Z" },
{ url = "https://files.pythonhosted.org/packages/27/df/5b505dc447eb71278eba10d5ec940769ca89c1af70f0468bfbcb98035dc2/regex-2025.7.34-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:69c593ff5a24c0d5c1112b0df9b09eae42b33c014bdca7022d6523b210b69f72", size = 797753, upload-time = "2025-07-31T00:20:27.919Z" },
{ url = "https://files.pythonhosted.org/packages/86/38/3e3dc953d13998fa047e9a2414b556201dbd7147034fbac129392363253b/regex-2025.7.34-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:98d0ce170fcde1a03b5df19c5650db22ab58af375aaa6ff07978a85c9f250f0e", size = 863263, upload-time = "2025-07-31T00:20:29.803Z" },
{ url = "https://files.pythonhosted.org/packages/68/e5/3ff66b29dde12f5b874dda2d9dec7245c2051f2528d8c2a797901497f140/regex-2025.7.34-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d72765a4bff8c43711d5b0f5b452991a9947853dfa471972169b3cc0ba1d0751", size = 910103, upload-time = "2025-07-31T00:20:31.313Z" },
{ url = "https://files.pythonhosted.org/packages/9e/fe/14176f2182125977fba3711adea73f472a11f3f9288c1317c59cd16ad5e6/regex-2025.7.34-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4494f8fd95a77eb434039ad8460e64d57baa0434f1395b7da44015bef650d0e4", size = 801709, upload-time = "2025-07-31T00:20:33.323Z" },
{ url = "https://files.pythonhosted.org/packages/5a/0d/80d4e66ed24f1ba876a9e8e31b709f9fd22d5c266bf5f3ab3c1afe683d7d/regex-2025.7.34-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4f42b522259c66e918a0121a12429b2abcf696c6f967fa37bdc7b72e61469f98", size = 786726, upload-time = "2025-07-31T00:20:35.252Z" },
{ url = "https://files.pythonhosted.org/packages/12/75/c3ebb30e04a56c046f5c85179dc173818551037daae2c0c940c7b19152cb/regex-2025.7.34-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:aaef1f056d96a0a5d53ad47d019d5b4c66fe4be2da87016e0d43b7242599ffc7", size = 857306, upload-time = "2025-07-31T00:20:37.12Z" },
{ url = "https://files.pythonhosted.org/packages/b1/b2/a4dc5d8b14f90924f27f0ac4c4c4f5e195b723be98adecc884f6716614b6/regex-2025.7.34-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:656433e5b7dccc9bc0da6312da8eb897b81f5e560321ec413500e5367fcd5d47", size = 848494, upload-time = "2025-07-31T00:20:38.818Z" },
{ url = "https://files.pythonhosted.org/packages/0d/21/9ac6e07a4c5e8646a90b56b61f7e9dac11ae0747c857f91d3d2bc7c241d9/regex-2025.7.34-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e91eb2c62c39705e17b4d42d4b86c4e86c884c0d15d9c5a47d0835f8387add8e", size = 787850, upload-time = "2025-07-31T00:20:40.478Z" },
{ url = "https://files.pythonhosted.org/packages/be/6c/d51204e28e7bc54f9a03bb799b04730d7e54ff2718862b8d4e09e7110a6a/regex-2025.7.34-cp314-cp314-win32.whl", hash = "sha256:f978ddfb6216028c8f1d6b0f7ef779949498b64117fc35a939022f67f810bdcb", size = 269730, upload-time = "2025-07-31T00:20:42.253Z" },
{ url = "https://files.pythonhosted.org/packages/74/52/a7e92d02fa1fdef59d113098cb9f02c5d03289a0e9f9e5d4d6acccd10677/regex-2025.7.34-cp314-cp314-win_amd64.whl", hash = "sha256:4b7dc33b9b48fb37ead12ffc7bdb846ac72f99a80373c4da48f64b373a7abeae", size = 278640, upload-time = "2025-07-31T00:20:44.42Z" },
{ url = "https://files.pythonhosted.org/packages/d1/78/a815529b559b1771080faa90c3ab401730661f99d495ab0071649f139ebd/regex-2025.7.34-cp314-cp314-win_arm64.whl", hash = "sha256:4b8c4d39f451e64809912c82392933d80fe2e4a87eeef8859fcc5380d0173c64", size = 271757, upload-time = "2025-07-31T00:20:46.355Z" },
]

[[package]]
@@ -665,56 +776,74 @@ sdist = { url = "https://files.pythonhosted.org/packages/0b/0f/b7d5d4b36553731f1

[[package]]
name = "ruff"
version = "0.12.2"
version = "0.12.7"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6c/3d/d9a195676f25d00dbfcf3cf95fdd4c685c497fcfa7e862a44ac5e4e96480/ruff-0.12.2.tar.gz", hash = "sha256:d7b4f55cd6f325cb7621244f19c873c565a08aff5a4ba9c69aa7355f3f7afd3e", size = 4432239, upload-time = "2025-07-03T16:40:19.566Z" }
sdist = { url = "https://files.pythonhosted.org/packages/a1/81/0bd3594fa0f690466e41bd033bdcdf86cba8288345ac77ad4afbe5ec743a/ruff-0.12.7.tar.gz", hash = "sha256:1fc3193f238bc2d7968772c82831a4ff69252f673be371fb49663f0068b7ec71", size = 5197814, upload-time = "2025-07-29T22:32:35.877Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/74/b6/2098d0126d2d3318fd5bec3ad40d06c25d377d95749f7a0c5af17129b3b1/ruff-0.12.2-py3-none-linux_armv6l.whl", hash = "sha256:093ea2b221df1d2b8e7ad92fc6ffdca40a2cb10d8564477a987b44fd4008a7be", size = 10369761, upload-time = "2025-07-03T16:39:38.847Z" },
{ url = "https://files.pythonhosted.org/packages/b1/4b/5da0142033dbe155dc598cfb99262d8ee2449d76920ea92c4eeb9547c208/ruff-0.12.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:09e4cf27cc10f96b1708100fa851e0daf21767e9709e1649175355280e0d950e", size = 11155659, upload-time = "2025-07-03T16:39:42.294Z" },
{ url = "https://files.pythonhosted.org/packages/3e/21/967b82550a503d7c5c5c127d11c935344b35e8c521f52915fc858fb3e473/ruff-0.12.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:8ae64755b22f4ff85e9c52d1f82644abd0b6b6b6deedceb74bd71f35c24044cc", size = 10537769, upload-time = "2025-07-03T16:39:44.75Z" },
{ url = "https://files.pythonhosted.org/packages/33/91/00cff7102e2ec71a4890fb7ba1803f2cdb122d82787c7d7cf8041fe8cbc1/ruff-0.12.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3eb3a6b2db4d6e2c77e682f0b988d4d61aff06860158fdb413118ca133d57922", size = 10717602, upload-time = "2025-07-03T16:39:47.652Z" },
{ url = "https://files.pythonhosted.org/packages/9b/eb/928814daec4e1ba9115858adcda44a637fb9010618721937491e4e2283b8/ruff-0.12.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:73448de992d05517170fc37169cbca857dfeaeaa8c2b9be494d7bcb0d36c8f4b", size = 10198772, upload-time = "2025-07-03T16:39:49.641Z" },
{ url = "https://files.pythonhosted.org/packages/50/fa/f15089bc20c40f4f72334f9145dde55ab2b680e51afb3b55422effbf2fb6/ruff-0.12.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3b8b94317cbc2ae4a2771af641739f933934b03555e51515e6e021c64441532d", size = 11845173, upload-time = "2025-07-03T16:39:52.069Z" },
{ url = "https://files.pythonhosted.org/packages/43/9f/1f6f98f39f2b9302acc161a4a2187b1e3a97634fe918a8e731e591841cf4/ruff-0.12.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:45fc42c3bf1d30d2008023a0a9a0cfb06bf9835b147f11fe0679f21ae86d34b1", size = 12553002, upload-time = "2025-07-03T16:39:54.551Z" },
{ url = "https://files.pythonhosted.org/packages/d8/70/08991ac46e38ddd231c8f4fd05ef189b1b94be8883e8c0c146a025c20a19/ruff-0.12.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce48f675c394c37e958bf229fb5c1e843e20945a6d962cf3ea20b7a107dcd9f4", size = 12171330, upload-time = "2025-07-03T16:39:57.55Z" },
{ url = "https://files.pythonhosted.org/packages/88/a9/5a55266fec474acfd0a1c73285f19dd22461d95a538f29bba02edd07a5d9/ruff-0.12.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:793d8859445ea47591272021a81391350205a4af65a9392401f418a95dfb75c9", size = 11774717, upload-time = "2025-07-03T16:39:59.78Z" },
{ url = "https://files.pythonhosted.org/packages/87/e5/0c270e458fc73c46c0d0f7cf970bb14786e5fdb88c87b5e423a4bd65232b/ruff-0.12.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6932323db80484dda89153da3d8e58164d01d6da86857c79f1961934354992da", size = 11646659, upload-time = "2025-07-03T16:40:01.934Z" },
{ url = "https://files.pythonhosted.org/packages/b7/b6/45ab96070c9752af37f0be364d849ed70e9ccede07675b0ec4e3ef76b63b/ruff-0.12.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:6aa7e623a3a11538108f61e859ebf016c4f14a7e6e4eba1980190cacb57714ce", size = 10604012, upload-time = "2025-07-03T16:40:04.363Z" },
{ url = "https://files.pythonhosted.org/packages/86/91/26a6e6a424eb147cc7627eebae095cfa0b4b337a7c1c413c447c9ebb72fd/ruff-0.12.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:2a4a20aeed74671b2def096bdf2eac610c7d8ffcbf4fb0e627c06947a1d7078d", size = 10176799, upload-time = "2025-07-03T16:40:06.514Z" },
{ url = "https://files.pythonhosted.org/packages/f5/0c/9f344583465a61c8918a7cda604226e77b2c548daf8ef7c2bfccf2b37200/ruff-0.12.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:71a4c550195612f486c9d1f2b045a600aeba851b298c667807ae933478fcef04", size = 11241507, upload-time = "2025-07-03T16:40:08.708Z" },
{ url = "https://files.pythonhosted.org/packages/1c/b7/99c34ded8fb5f86c0280278fa89a0066c3760edc326e935ce0b1550d315d/ruff-0.12.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:4987b8f4ceadf597c927beee65a5eaf994c6e2b631df963f86d8ad1bdea99342", size = 11717609, upload-time = "2025-07-03T16:40:10.836Z" },
{ url = "https://files.pythonhosted.org/packages/51/de/8589fa724590faa057e5a6d171e7f2f6cffe3287406ef40e49c682c07d89/ruff-0.12.2-py3-none-win32.whl", hash = "sha256:369ffb69b70cd55b6c3fc453b9492d98aed98062db9fec828cdfd069555f5f1a", size = 10523823, upload-time = "2025-07-03T16:40:13.203Z" },
{ url = "https://files.pythonhosted.org/packages/94/47/8abf129102ae4c90cba0c2199a1a9b0fa896f6f806238d6f8c14448cc748/ruff-0.12.2-py3-none-win_amd64.whl", hash = "sha256:dca8a3b6d6dc9810ed8f328d406516bf4d660c00caeaef36eb831cf4871b0639", size = 11629831, upload-time = "2025-07-03T16:40:15.478Z" },
{ url = "https://files.pythonhosted.org/packages/e2/1f/72d2946e3cc7456bb837e88000eb3437e55f80db339c840c04015a11115d/ruff-0.12.2-py3-none-win_arm64.whl", hash = "sha256:48d6c6bfb4761df68bc05ae630e24f506755e702d4fb08f08460be778c7ccb12", size = 10735334, upload-time = "2025-07-03T16:40:17.677Z" },
{ url = "https://files.pythonhosted.org/packages/e1/d2/6cb35e9c85e7a91e8d22ab32ae07ac39cc34a71f1009a6f9e4a2a019e602/ruff-0.12.7-py3-none-linux_armv6l.whl", hash = "sha256:76e4f31529899b8c434c3c1dede98c4483b89590e15fb49f2d46183801565303", size = 11852189, upload-time = "2025-07-29T22:31:41.281Z" },
{ url = "https://files.pythonhosted.org/packages/63/5b/a4136b9921aa84638f1a6be7fb086f8cad0fde538ba76bda3682f2599a2f/ruff-0.12.7-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:789b7a03e72507c54fb3ba6209e4bb36517b90f1a3569ea17084e3fd295500fb", size = 12519389, upload-time = "2025-07-29T22:31:54.265Z" },
{ url = "https://files.pythonhosted.org/packages/a8/c9/3e24a8472484269b6b1821794141f879c54645a111ded4b6f58f9ab0705f/ruff-0.12.7-py3-none-macosx_11_0_arm64.whl", hash = "sha256:2e1c2a3b8626339bb6369116e7030a4cf194ea48f49b64bb505732a7fce4f4e3", size = 11743384, upload-time = "2025-07-29T22:31:59.575Z" },
{ url = "https://files.pythonhosted.org/packages/26/7c/458dd25deeb3452c43eaee853c0b17a1e84169f8021a26d500ead77964fd/ruff-0.12.7-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:32dec41817623d388e645612ec70d5757a6d9c035f3744a52c7b195a57e03860", size = 11943759, upload-time = "2025-07-29T22:32:01.95Z" },
{ url = "https://files.pythonhosted.org/packages/7f/8b/658798472ef260ca050e400ab96ef7e85c366c39cf3dfbef4d0a46a528b6/ruff-0.12.7-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:47ef751f722053a5df5fa48d412dbb54d41ab9b17875c6840a58ec63ff0c247c", size = 11654028, upload-time = "2025-07-29T22:32:04.367Z" },
{ url = "https://files.pythonhosted.org/packages/a8/86/9c2336f13b2a3326d06d39178fd3448dcc7025f82514d1b15816fe42bfe8/ruff-0.12.7-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a828a5fc25a3efd3e1ff7b241fd392686c9386f20e5ac90aa9234a5faa12c423", size = 13225209, upload-time = "2025-07-29T22:32:06.952Z" },
{ url = "https://files.pythonhosted.org/packages/76/69/df73f65f53d6c463b19b6b312fd2391dc36425d926ec237a7ed028a90fc1/ruff-0.12.7-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:5726f59b171111fa6a69d82aef48f00b56598b03a22f0f4170664ff4d8298efb", size = 14182353, upload-time = "2025-07-29T22:32:10.053Z" },
{ url = "https://files.pythonhosted.org/packages/58/1e/de6cda406d99fea84b66811c189b5ea139814b98125b052424b55d28a41c/ruff-0.12.7-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74e6f5c04c4dd4aba223f4fe6e7104f79e0eebf7d307e4f9b18c18362124bccd", size = 13631555, upload-time = "2025-07-29T22:32:12.644Z" },
{ url = "https://files.pythonhosted.org/packages/6f/ae/625d46d5164a6cc9261945a5e89df24457dc8262539ace3ac36c40f0b51e/ruff-0.12.7-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d0bfe4e77fba61bf2ccadf8cf005d6133e3ce08793bbe870dd1c734f2699a3e", size = 12667556, upload-time = "2025-07-29T22:32:15.312Z" },
{ url = "https://files.pythonhosted.org/packages/55/bf/9cb1ea5e3066779e42ade8d0cd3d3b0582a5720a814ae1586f85014656b6/ruff-0.12.7-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06bfb01e1623bf7f59ea749a841da56f8f653d641bfd046edee32ede7ff6c606", size = 12939784, upload-time = "2025-07-29T22:32:17.69Z" },
{ url = "https://files.pythonhosted.org/packages/55/7f/7ead2663be5627c04be83754c4f3096603bf5e99ed856c7cd29618c691bd/ruff-0.12.7-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e41df94a957d50083fd09b916d6e89e497246698c3f3d5c681c8b3e7b9bb4ac8", size = 11771356, upload-time = "2025-07-29T22:32:20.134Z" },
{ url = "https://files.pythonhosted.org/packages/17/40/a95352ea16edf78cd3a938085dccc55df692a4d8ba1b3af7accbe2c806b0/ruff-0.12.7-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:4000623300563c709458d0ce170c3d0d788c23a058912f28bbadc6f905d67afa", size = 11612124, upload-time = "2025-07-29T22:32:22.645Z" },
{ url = "https://files.pythonhosted.org/packages/4d/74/633b04871c669e23b8917877e812376827c06df866e1677f15abfadc95cb/ruff-0.12.7-py3-none-musllinux_1_2_i686.whl", hash = "sha256:69ffe0e5f9b2cf2b8e289a3f8945b402a1b19eff24ec389f45f23c42a3dd6fb5", size = 12479945, upload-time = "2025-07-29T22:32:24.765Z" },
{ url = "https://files.pythonhosted.org/packages/be/34/c3ef2d7799c9778b835a76189c6f53c179d3bdebc8c65288c29032e03613/ruff-0.12.7-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:a07a5c8ffa2611a52732bdc67bf88e243abd84fe2d7f6daef3826b59abbfeda4", size = 12998677, upload-time = "2025-07-29T22:32:27.022Z" },
{ url = "https://files.pythonhosted.org/packages/77/ab/aca2e756ad7b09b3d662a41773f3edcbd262872a4fc81f920dc1ffa44541/ruff-0.12.7-py3-none-win32.whl", hash = "sha256:c928f1b2ec59fb77dfdf70e0419408898b63998789cc98197e15f560b9e77f77", size = 11756687, upload-time = "2025-07-29T22:32:29.381Z" },
{ url = "https://files.pythonhosted.org/packages/b4/71/26d45a5042bc71db22ddd8252ca9d01e9ca454f230e2996bb04f16d72799/ruff-0.12.7-py3-none-win_amd64.whl", hash = "sha256:9c18f3d707ee9edf89da76131956aba1270c6348bfee8f6c647de841eac7194f", size = 12912365, upload-time = "2025-07-29T22:32:31.517Z" },
{ url = "https://files.pythonhosted.org/packages/4c/9b/0b8aa09817b63e78d94b4977f18b1fcaead3165a5ee49251c5d5c245bb2d/ruff-0.12.7-py3-none-win_arm64.whl", hash = "sha256:dfce05101dbd11833a0776716d5d1578641b7fddb537fe7fa956ab85d1769b69", size = 11982083, upload-time = "2025-07-29T22:32:33.881Z" },
]

[[package]]
name = "scipy"
version = "1.16.0"
version = "1.16.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "numpy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/81/18/b06a83f0c5ee8cddbde5e3f3d0bb9b702abfa5136ef6d4620ff67df7eee5/scipy-1.16.0.tar.gz", hash = "sha256:b5ef54021e832869c8cfb03bc3bf20366cbcd426e02a58e8a58d7584dfbb8f62", size = 30581216, upload-time = "2025-06-22T16:27:55.782Z" }
sdist = { url = "https://files.pythonhosted.org/packages/f5/4a/b927028464795439faec8eaf0b03b011005c487bb2d07409f28bf30879c4/scipy-1.16.1.tar.gz", hash = "sha256:44c76f9e8b6e8e488a586190ab38016e4ed2f8a038af7cd3defa903c0a2238b3", size = 30580861, upload-time = "2025-07-27T16:33:30.834Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/46/95/0746417bc24be0c2a7b7563946d61f670a3b491b76adede420e9d173841f/scipy-1.16.0-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:e9f414cbe9ca289a73e0cc92e33a6a791469b6619c240aa32ee18abdce8ab451", size = 36418162, upload-time = "2025-06-22T16:19:56.3Z" },
{ url = "https://files.pythonhosted.org/packages/19/5a/914355a74481b8e4bbccf67259bbde171348a3f160b67b4945fbc5f5c1e5/scipy-1.16.0-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:bbba55fb97ba3cdef9b1ee973f06b09d518c0c7c66a009c729c7d1592be1935e", size = 28465985, upload-time = "2025-06-22T16:20:01.238Z" },
{ url = "https://files.pythonhosted.org/packages/58/46/63477fc1246063855969cbefdcee8c648ba4b17f67370bd542ba56368d0b/scipy-1.16.0-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:58e0d4354eacb6004e7aa1cd350e5514bd0270acaa8d5b36c0627bb3bb486974", size = 20737961, upload-time = "2025-06-22T16:20:05.913Z" },
{ url = "https://files.pythonhosted.org/packages/93/86/0fbb5588b73555e40f9d3d6dde24ee6fac7d8e301a27f6f0cab9d8f66ff2/scipy-1.16.0-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:75b2094ec975c80efc273567436e16bb794660509c12c6a31eb5c195cbf4b6dc", size = 23377941, upload-time = "2025-06-22T16:20:10.668Z" },
{ url = "https://files.pythonhosted.org/packages/ca/80/a561f2bf4c2da89fa631b3cbf31d120e21ea95db71fd9ec00cb0247c7a93/scipy-1.16.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:6b65d232157a380fdd11a560e7e21cde34fdb69d65c09cb87f6cc024ee376351", size = 33196703, upload-time = "2025-06-22T16:20:16.097Z" },
{ url = "https://files.pythonhosted.org/packages/11/6b/3443abcd0707d52e48eb315e33cc669a95e29fc102229919646f5a501171/scipy-1.16.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:1d8747f7736accd39289943f7fe53a8333be7f15a82eea08e4afe47d79568c32", size = 35083410, upload-time = "2025-06-22T16:20:21.734Z" },
{ url = "https://files.pythonhosted.org/packages/20/ab/eb0fc00e1e48961f1bd69b7ad7e7266896fe5bad4ead91b5fc6b3561bba4/scipy-1.16.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:eb9f147a1b8529bb7fec2a85cf4cf42bdfadf9e83535c309a11fdae598c88e8b", size = 35387829, upload-time = "2025-06-22T16:20:27.548Z" },
{ url = "https://files.pythonhosted.org/packages/57/9e/d6fc64e41fad5d481c029ee5a49eefc17f0b8071d636a02ceee44d4a0de2/scipy-1.16.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d2b83c37edbfa837a8923d19c749c1935ad3d41cf196006a24ed44dba2ec4358", size = 37841356, upload-time = "2025-06-22T16:20:35.112Z" },
{ url = "https://files.pythonhosted.org/packages/7c/a7/4c94bbe91f12126b8bf6709b2471900577b7373a4fd1f431f28ba6f81115/scipy-1.16.0-cp313-cp313-win_amd64.whl", hash = "sha256:79a3c13d43c95aa80b87328a46031cf52508cf5f4df2767602c984ed1d3c6bbe", size = 38403710, upload-time = "2025-06-22T16:21:54.473Z" },
{ url = "https://files.pythonhosted.org/packages/47/20/965da8497f6226e8fa90ad3447b82ed0e28d942532e92dd8b91b43f100d4/scipy-1.16.0-cp313-cp313t-macosx_10_14_x86_64.whl", hash = "sha256:f91b87e1689f0370690e8470916fe1b2308e5b2061317ff76977c8f836452a47", size = 36813833, upload-time = "2025-06-22T16:20:43.925Z" },
{ url = "https://files.pythonhosted.org/packages/28/f4/197580c3dac2d234e948806e164601c2df6f0078ed9f5ad4a62685b7c331/scipy-1.16.0-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:88a6ca658fb94640079e7a50b2ad3b67e33ef0f40e70bdb7dc22017dae73ac08", size = 28974431, upload-time = "2025-06-22T16:20:51.302Z" },
{ url = "https://files.pythonhosted.org/packages/8a/fc/e18b8550048d9224426e76906694c60028dbdb65d28b1372b5503914b89d/scipy-1.16.0-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:ae902626972f1bd7e4e86f58fd72322d7f4ec7b0cfc17b15d4b7006efc385176", size = 21246454, upload-time = "2025-06-22T16:20:57.276Z" },
{ url = "https://files.pythonhosted.org/packages/8c/48/07b97d167e0d6a324bfd7484cd0c209cc27338b67e5deadae578cf48e809/scipy-1.16.0-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:8cb824c1fc75ef29893bc32b3ddd7b11cf9ab13c1127fe26413a05953b8c32ed", size = 23772979, upload-time = "2025-06-22T16:21:03.363Z" },
{ url = "https://files.pythonhosted.org/packages/4c/4f/9efbd3f70baf9582edf271db3002b7882c875ddd37dc97f0f675ad68679f/scipy-1.16.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:de2db7250ff6514366a9709c2cba35cb6d08498e961cba20d7cff98a7ee88938", size = 33341972, upload-time = "2025-06-22T16:21:11.14Z" },
{ url = "https://files.pythonhosted.org/packages/3f/dc/9e496a3c5dbe24e76ee24525155ab7f659c20180bab058ef2c5fa7d9119c/scipy-1.16.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e85800274edf4db8dd2e4e93034f92d1b05c9421220e7ded9988b16976f849c1", size = 35185476, upload-time = "2025-06-22T16:21:19.156Z" },
{ url = "https://files.pythonhosted.org/packages/ce/b3/21001cff985a122ba434c33f2c9d7d1dc3b669827e94f4fc4e1fe8b9dfd8/scipy-1.16.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4f720300a3024c237ace1cb11f9a84c38beb19616ba7c4cdcd771047a10a1706", size = 35570990, upload-time = "2025-06-22T16:21:27.797Z" },
{ url = "https://files.pythonhosted.org/packages/e5/d3/7ba42647d6709251cdf97043d0c107e0317e152fa2f76873b656b509ff55/scipy-1.16.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:aad603e9339ddb676409b104c48a027e9916ce0d2838830691f39552b38a352e", size = 37950262, upload-time = "2025-06-22T16:21:36.976Z" },
{ url = "https://files.pythonhosted.org/packages/eb/c4/231cac7a8385394ebbbb4f1ca662203e9d8c332825ab4f36ffc3ead09a42/scipy-1.16.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f56296fefca67ba605fd74d12f7bd23636267731a72cb3947963e76b8c0a25db", size = 38515076, upload-time = "2025-06-22T16:21:45.694Z" },
{ url = "https://files.pythonhosted.org/packages/93/0b/b5c99382b839854a71ca9482c684e3472badc62620287cbbdab499b75ce6/scipy-1.16.1-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:5451606823a5e73dfa621a89948096c6528e2896e40b39248295d3a0138d594f", size = 36533717, upload-time = "2025-07-27T16:28:51.706Z" },
{ url = "https://files.pythonhosted.org/packages/eb/e5/69ab2771062c91e23e07c12e7d5033a6b9b80b0903ee709c3c36b3eb520c/scipy-1.16.1-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:89728678c5ca5abd610aee148c199ac1afb16e19844401ca97d43dc548a354eb", size = 28570009, upload-time = "2025-07-27T16:28:57.017Z" },
{ url = "https://files.pythonhosted.org/packages/f4/69/bd75dbfdd3cf524f4d753484d723594aed62cfaac510123e91a6686d520b/scipy-1.16.1-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:e756d688cb03fd07de0fffad475649b03cb89bee696c98ce508b17c11a03f95c", size = 20841942, upload-time = "2025-07-27T16:29:01.152Z" },
{ url = "https://files.pythonhosted.org/packages/ea/74/add181c87663f178ba7d6144b370243a87af8476664d5435e57d599e6874/scipy-1.16.1-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:5aa2687b9935da3ed89c5dbed5234576589dd28d0bf7cd237501ccfbdf1ad608", size = 23498507, upload-time = "2025-07-27T16:29:05.202Z" },
{ url = "https://files.pythonhosted.org/packages/1d/74/ece2e582a0d9550cee33e2e416cc96737dce423a994d12bbe59716f47ff1/scipy-1.16.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0851f6a1e537fe9399f35986897e395a1aa61c574b178c0d456be5b1a0f5ca1f", size = 33286040, upload-time = "2025-07-27T16:29:10.201Z" },
{ url = "https://files.pythonhosted.org/packages/e4/82/08e4076df538fb56caa1d489588d880ec7c52d8273a606bb54d660528f7c/scipy-1.16.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fedc2cbd1baed37474b1924c331b97bdff611d762c196fac1a9b71e67b813b1b", size = 35176096, upload-time = "2025-07-27T16:29:17.091Z" },
{ url = "https://files.pythonhosted.org/packages/fa/79/cd710aab8c921375711a8321c6be696e705a120e3011a643efbbcdeeabcc/scipy-1.16.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:2ef500e72f9623a6735769e4b93e9dcb158d40752cdbb077f305487e3e2d1f45", size = 35490328, upload-time = "2025-07-27T16:29:22.928Z" },
{ url = "https://files.pythonhosted.org/packages/71/73/e9cc3d35ee4526d784520d4494a3e1ca969b071fb5ae5910c036a375ceec/scipy-1.16.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:978d8311674b05a8f7ff2ea6c6bce5d8b45a0cb09d4c5793e0318f448613ea65", size = 37939921, upload-time = "2025-07-27T16:29:29.108Z" },
{ url = "https://files.pythonhosted.org/packages/21/12/c0efd2941f01940119b5305c375ae5c0fcb7ec193f806bd8f158b73a1782/scipy-1.16.1-cp313-cp313-win_amd64.whl", hash = "sha256:81929ed0fa7a5713fcdd8b2e6f73697d3b4c4816d090dd34ff937c20fa90e8ab", size = 38479462, upload-time = "2025-07-27T16:30:24.078Z" },
{ url = "https://files.pythonhosted.org/packages/7a/19/c3d08b675260046a991040e1ea5d65f91f40c7df1045fffff412dcfc6765/scipy-1.16.1-cp313-cp313t-macosx_10_14_x86_64.whl", hash = "sha256:bcc12db731858abda693cecdb3bdc9e6d4bd200213f49d224fe22df82687bdd6", size = 36938832, upload-time = "2025-07-27T16:29:35.057Z" },
{ url = "https://files.pythonhosted.org/packages/81/f2/ce53db652c033a414a5b34598dba6b95f3d38153a2417c5a3883da429029/scipy-1.16.1-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:744d977daa4becb9fc59135e75c069f8d301a87d64f88f1e602a9ecf51e77b27", size = 29093084, upload-time = "2025-07-27T16:29:40.201Z" },
{ url = "https://files.pythonhosted.org/packages/a9/ae/7a10ff04a7dc15f9057d05b33737ade244e4bd195caa3f7cc04d77b9e214/scipy-1.16.1-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:dc54f76ac18073bcecffb98d93f03ed6b81a92ef91b5d3b135dcc81d55a724c7", size = 21365098, upload-time = "2025-07-27T16:29:44.295Z" },
{ url = "https://files.pythonhosted.org/packages/36/ac/029ff710959932ad3c2a98721b20b405f05f752f07344622fd61a47c5197/scipy-1.16.1-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:367d567ee9fc1e9e2047d31f39d9d6a7a04e0710c86e701e053f237d14a9b4f6", size = 23896858, upload-time = "2025-07-27T16:29:48.784Z" },
{ url = "https://files.pythonhosted.org/packages/71/13/d1ef77b6bd7898720e1f0b6b3743cb945f6c3cafa7718eaac8841035ab60/scipy-1.16.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4cf5785e44e19dcd32a0e4807555e1e9a9b8d475c6afff3d21c3c543a6aa84f4", size = 33438311, upload-time = "2025-07-27T16:29:54.164Z" },
{ url = "https://files.pythonhosted.org/packages/2d/e0/e64a6821ffbb00b4c5b05169f1c1fddb4800e9307efe3db3788995a82a2c/scipy-1.16.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3d0b80fb26d3e13a794c71d4b837e2a589d839fd574a6bbb4ee1288c213ad4a3", size = 35279542, upload-time = "2025-07-27T16:30:00.249Z" },
{ url = "https://files.pythonhosted.org/packages/57/59/0dc3c8b43e118f1e4ee2b798dcc96ac21bb20014e5f1f7a8e85cc0653bdb/scipy-1.16.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:8503517c44c18d1030d666cb70aaac1cc8913608816e06742498833b128488b7", size = 35667665, upload-time = "2025-07-27T16:30:05.916Z" },
{ url = "https://files.pythonhosted.org/packages/45/5f/844ee26e34e2f3f9f8febb9343748e72daeaec64fe0c70e9bf1ff84ec955/scipy-1.16.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:30cc4bb81c41831ecfd6dc450baf48ffd80ef5aed0f5cf3ea775740e80f16ecc", size = 38045210, upload-time = "2025-07-27T16:30:11.655Z" },
{ url = "https://files.pythonhosted.org/packages/8d/d7/210f2b45290f444f1de64bc7353aa598ece9f0e90c384b4a156f9b1a5063/scipy-1.16.1-cp313-cp313t-win_amd64.whl", hash = "sha256:c24fa02f7ed23ae514460a22c57eca8f530dbfa50b1cfdbf4f37c05b5309cc39", size = 38593661, upload-time = "2025-07-27T16:30:17.825Z" },
{ url = "https://files.pythonhosted.org/packages/81/ea/84d481a5237ed223bd3d32d6e82d7a6a96e34756492666c260cef16011d1/scipy-1.16.1-cp314-cp314-macosx_10_14_x86_64.whl", hash = "sha256:796a5a9ad36fa3a782375db8f4241ab02a091308eb079746bc0f874c9b998318", size = 36525921, upload-time = "2025-07-27T16:30:30.081Z" },
{ url = "https://files.pythonhosted.org/packages/4e/9f/d9edbdeff9f3a664807ae3aea383e10afaa247e8e6255e6d2aa4515e8863/scipy-1.16.1-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:3ea0733a2ff73fd6fdc5fecca54ee9b459f4d74f00b99aced7d9a3adb43fb1cc", size = 28564152, upload-time = "2025-07-27T16:30:35.336Z" },
{ url = "https://files.pythonhosted.org/packages/3b/95/8125bcb1fe04bc267d103e76516243e8d5e11229e6b306bda1024a5423d1/scipy-1.16.1-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:85764fb15a2ad994e708258bb4ed8290d1305c62a4e1ef07c414356a24fcfbf8", size = 20836028, upload-time = "2025-07-27T16:30:39.421Z" },
{ url = "https://files.pythonhosted.org/packages/77/9c/bf92e215701fc70bbcd3d14d86337cf56a9b912a804b9c776a269524a9e9/scipy-1.16.1-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:ca66d980469cb623b1759bdd6e9fd97d4e33a9fad5b33771ced24d0cb24df67e", size = 23489666, upload-time = "2025-07-27T16:30:43.663Z" },
{ url = "https://files.pythonhosted.org/packages/5e/00/5e941d397d9adac41b02839011594620d54d99488d1be5be755c00cde9ee/scipy-1.16.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:e7cc1ffcc230f568549fc56670bcf3df1884c30bd652c5da8138199c8c76dae0", size = 33358318, upload-time = "2025-07-27T16:30:48.982Z" },
{ url = "https://files.pythonhosted.org/packages/0e/87/8db3aa10dde6e3e8e7eb0133f24baa011377d543f5b19c71469cf2648026/scipy-1.16.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3ddfb1e8d0b540cb4ee9c53fc3dea3186f97711248fb94b4142a1b27178d8b4b", size = 35185724, upload-time = "2025-07-27T16:30:54.26Z" },
{ url = "https://files.pythonhosted.org/packages/89/b4/6ab9ae443216807622bcff02690262d8184078ea467efee2f8c93288a3b1/scipy-1.16.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4dc0e7be79e95d8ba3435d193e0d8ce372f47f774cffd882f88ea4e1e1ddc731", size = 35554335, upload-time = "2025-07-27T16:30:59.765Z" },
{ url = "https://files.pythonhosted.org/packages/9c/9a/d0e9dc03c5269a1afb60661118296a32ed5d2c24298af61b676c11e05e56/scipy-1.16.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:f23634f9e5adb51b2a77766dac217063e764337fbc816aa8ad9aaebcd4397fd3", size = 37960310, upload-time = "2025-07-27T16:31:06.151Z" },
{ url = "https://files.pythonhosted.org/packages/5e/00/c8f3130a50521a7977874817ca89e0599b1b4ee8e938bad8ae798a0e1f0d/scipy-1.16.1-cp314-cp314-win_amd64.whl", hash = "sha256:57d75524cb1c5a374958a2eae3d84e1929bb971204cc9d52213fb8589183fc19", size = 39319239, upload-time = "2025-07-27T16:31:59.942Z" },
{ url = "https://files.pythonhosted.org/packages/f2/f2/1ca3eda54c3a7e4c92f6acef7db7b3a057deb135540d23aa6343ef8ad333/scipy-1.16.1-cp314-cp314t-macosx_10_14_x86_64.whl", hash = "sha256:d8da7c3dd67bcd93f15618938f43ed0995982eb38973023d46d4646c4283ad65", size = 36939460, upload-time = "2025-07-27T16:31:11.865Z" },
{ url = "https://files.pythonhosted.org/packages/80/30/98c2840b293a132400c0940bb9e140171dcb8189588619048f42b2ce7b4f/scipy-1.16.1-cp314-cp314t-macosx_12_0_arm64.whl", hash = "sha256:cc1d2f2fd48ba1e0620554fe5bc44d3e8f5d4185c8c109c7fbdf5af2792cfad2", size = 29093322, upload-time = "2025-07-27T16:31:17.045Z" },
{ url = "https://files.pythonhosted.org/packages/c1/e6/1e6e006e850622cf2a039b62d1a6ddc4497d4851e58b68008526f04a9a00/scipy-1.16.1-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:21a611ced9275cb861bacadbada0b8c0623bc00b05b09eb97f23b370fc2ae56d", size = 21365329, upload-time = "2025-07-27T16:31:21.188Z" },
{ url = "https://files.pythonhosted.org/packages/8e/02/72a5aa5b820589dda9a25e329ca752842bfbbaf635e36bc7065a9b42216e/scipy-1.16.1-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:8dfbb25dffc4c3dd9371d8ab456ca81beeaf6f9e1c2119f179392f0dc1ab7695", size = 23897544, upload-time = "2025-07-27T16:31:25.408Z" },
{ url = "https://files.pythonhosted.org/packages/2b/dc/7122d806a6f9eb8a33532982234bed91f90272e990f414f2830cfe656e0b/scipy-1.16.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:f0ebb7204f063fad87fc0a0e4ff4a2ff40b2a226e4ba1b7e34bf4b79bf97cd86", size = 33442112, upload-time = "2025-07-27T16:31:30.62Z" },
{ url = "https://files.pythonhosted.org/packages/24/39/e383af23564daa1021a5b3afbe0d8d6a68ec639b943661841f44ac92de85/scipy-1.16.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f1b9e5962656f2734c2b285a8745358ecb4e4efbadd00208c80a389227ec61ff", size = 35286594, upload-time = "2025-07-27T16:31:36.112Z" },
{ url = "https://files.pythonhosted.org/packages/95/47/1a0b0aff40c3056d955f38b0df5d178350c3d74734ec54f9c68d23910be5/scipy-1.16.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5e1a106f8c023d57a2a903e771228bf5c5b27b5d692088f457acacd3b54511e4", size = 35665080, upload-time = "2025-07-27T16:31:42.025Z" },
{ url = "https://files.pythonhosted.org/packages/64/df/ce88803e9ed6e27fe9b9abefa157cf2c80e4fa527cf17ee14be41f790ad4/scipy-1.16.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:709559a1db68a9abc3b2c8672c4badf1614f3b440b3ab326d86a5c0491eafae3", size = 38050306, upload-time = "2025-07-27T16:31:48.109Z" },
{ url = "https://files.pythonhosted.org/packages/6e/6c/a76329897a7cae4937d403e623aa6aaea616a0bb5b36588f0b9d1c9a3739/scipy-1.16.1-cp314-cp314t-win_amd64.whl", hash = "sha256:c0c804d60492a0aad7f5b2bb1862f4548b990049e27e828391ff2bf6f7199998", size = 39427705, upload-time = "2025-07-27T16:31:53.96Z" },
]

[[package]]
@@ -737,14 +866,14 @@ wheels = [

[[package]]
name = "starlette"
version = "0.46.2"
version = "0.47.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ce/20/08dfcd9c983f6a6f4a1000d934b9e6d626cff8d2eeb77a89a68eef20a2b7/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5", size = 2580846, upload-time = "2025-04-13T13:56:17.942Z" }
sdist = { url = "https://files.pythonhosted.org/packages/04/57/d062573f391d062710d4088fa1369428c38d51460ab6fedff920efef932e/starlette-0.47.2.tar.gz", hash = "sha256:6ae9aa5db235e4846decc1e7b79c4f346adf41e9777aebeb49dfd09bbd7023d8", size = 2583948, upload-time = "2025-07-20T17:31:58.522Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8b/0c/9d30a4ebeb6db2b25a841afbb80f6ef9a854fc3b41be131d249a977b4959/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35", size = 72037, upload-time = "2025-04-13T13:56:16.21Z" },
{ url = "https://files.pythonhosted.org/packages/f7/1f/b876b1f83aef204198a42dc101613fefccb32258e5428b5f9259677864b4/starlette-0.47.2-py3-none-any.whl", hash = "sha256:c5847e96134e5c5371ee9fac6fdf1a67336d5815e09eb2a01fdb57a351ef915b", size = 72984, upload-time = "2025-07-20T17:31:56.738Z" },
]

[[package]]
@@ -774,6 +903,45 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/8a/b6/097367f180b6383a3581ca1b86fcae284e52075fa941d1232df35293363c/trafilatura-2.0.0-py3-none-any.whl", hash = "sha256:77eb5d1e993747f6f20938e1de2d840020719735690c840b9a1024803a4cd51d", size = 132557, upload-time = "2024-12-03T15:23:21.41Z" },
]

[[package]]
name = "types-python-dateutil"
version = "2.9.0.20250708"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c9/95/6bdde7607da2e1e99ec1c1672a759d42f26644bbacf939916e086db34870/types_python_dateutil-2.9.0.20250708.tar.gz", hash = "sha256:ccdbd75dab2d6c9696c350579f34cffe2c281e4c5f27a585b2a2438dd1d5c8ab", size = 15834, upload-time = "2025-07-08T03:14:03.382Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/72/52/43e70a8e57fefb172c22a21000b03ebcc15e47e97f5cb8495b9c2832efb4/types_python_dateutil-2.9.0.20250708-py3-none-any.whl", hash = "sha256:4d6d0cc1cc4d24a2dc3816024e502564094497b713f7befda4d5bc7a8e3fd21f", size = 17724, upload-time = "2025-07-08T03:14:02.593Z" },
]

[[package]]
name = "types-pytz"
version = "2025.2.0.20250516"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/bd/72/b0e711fd90409f5a76c75349055d3eb19992c110f0d2d6aabbd6cfbc14bf/types_pytz-2025.2.0.20250516.tar.gz", hash = "sha256:e1216306f8c0d5da6dafd6492e72eb080c9a166171fa80dd7a1990fd8be7a7b3", size = 10940, upload-time = "2025-05-16T03:07:01.91Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c1/ba/e205cd11c1c7183b23c97e4bcd1de7bc0633e2e867601c32ecfc6ad42675/types_pytz-2025.2.0.20250516-py3-none-any.whl", hash = "sha256:e0e0c8a57e2791c19f718ed99ab2ba623856b11620cb6b637e5f62ce285a7451", size = 10136, upload-time = "2025-05-16T03:07:01.075Z" },
]

[[package]]
name = "types-pyyaml"
version = "6.0.12.20250516"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4e/22/59e2aeb48ceeee1f7cd4537db9568df80d62bdb44a7f9e743502ea8aab9c/types_pyyaml-6.0.12.20250516.tar.gz", hash = "sha256:9f21a70216fc0fa1b216a8176db5f9e0af6eb35d2f2932acb87689d03a5bf6ba", size = 17378, upload-time = "2025-05-16T03:08:04.897Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/99/5f/e0af6f7f6a260d9af67e1db4f54d732abad514252a7a378a6c4d17dd1036/types_pyyaml-6.0.12.20250516-py3-none-any.whl", hash = "sha256:8478208feaeb53a34cb5d970c56a7cd76b72659442e733e268a94dc72b2d0530", size = 20312, upload-time = "2025-05-16T03:08:04.019Z" },
]

[[package]]
name = "types-requests"
version = "2.32.4.20250611"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/6d/7f/73b3a04a53b0fd2a911d4ec517940ecd6600630b559e4505cc7b68beb5a0/types_requests-2.32.4.20250611.tar.gz", hash = "sha256:741c8777ed6425830bf51e54d6abe245f79b4dcb9019f1622b773463946bf826", size = 23118, upload-time = "2025-06-11T03:11:41.272Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3d/ea/0be9258c5a4fa1ba2300111aa5a0767ee6d18eb3fd20e91616c12082284d/types_requests-2.32.4.20250611-py3-none-any.whl", hash = "sha256:ad2fe5d3b0cb3c2c902c8815a70e7fb2302c4b8c1f77bdcd738192cdb3878072", size = 20643, upload-time = "2025-06-11T03:11:40.186Z" },
]

[[package]]
name = "typing-extensions"
version = "4.14.1"