Compare commits: 1 commit, `main...b280d62ee5`

**New file: `.claude/backend-mqtt-alerts-prompt.md` (153 lines)**
# Backend Task: Subscribe to Vesper MQTT Alert Topics

> Use this document as a prompt / task brief for implementing the backend side
> of the Vesper MQTT alert system. The firmware changes are complete.
> Full topic spec: `docs/reference/mqtt-events.md`

---

## What the firmware now publishes

The Vesper firmware (v155+) publishes on three status topics:

### 1. `vesper/{device_id}/status/heartbeat` (unchanged)

- Every 30 seconds, retained, QoS 1
- You already handle this — **no change needed** except: suppress any log entry / display update triggered by heartbeat arrival. Update `last_seen` silently. Only surface an event when the device goes *silent* (no heartbeat for 90s).
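The silence check itself reduces to a pure timestamp comparison. A minimal sketch, assuming timestamps in seconds; `is_silent` is a hypothetical helper name, not part of the existing backend:

```python
HEARTBEAT_TIMEOUT_S = 90  # per the spec above: 90 s without a heartbeat means silent

def is_silent(last_seen: float, now: float, timeout: float = HEARTBEAT_TIMEOUT_S) -> bool:
    """True when the device has gone longer than `timeout` without a heartbeat."""
    return (now - last_seen) > timeout
```

Run this from a periodic task over all known devices, and emit the "device silent" event only on the False-to-True transition so it fires once, not every tick.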
### 2. `vesper/{device_id}/status/alerts` (NEW)

- Published only when a subsystem state changes (HEALTHY → WARNING, WARNING → CRITICAL, etc.)
- QoS 1, not retained
- One message per state transition — not repeated until the state changes again

**Alert payload:**

```json
{ "subsystem": "FileManager", "state": "WARNING", "msg": "ConfigManager health check failed" }
```

**Cleared payload (recovery):**

```json
{ "subsystem": "FileManager", "state": "CLEARED" }
```

### 3. `vesper/{device_id}/status/info` (NEW)

- Published on significant device state changes (playback start/stop, etc.)
- QoS 0, not retained

```json
{ "type": "playback_started", "payload": { "melody_uid": "ABC123" } }
```

---

## What to implement in the backend (FastAPI + MQTT)

### Subscribe to new topics

Add to your MQTT subscription list:

```python
client.subscribe("vesper/+/status/alerts", qos=1)
client.subscribe("vesper/+/status/info", qos=0)
```
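With wildcard subscriptions, one callback receives every status message, so the device id and topic kind have to be parsed out of the topic string before dispatching. A sketch of that routing step, assuming only the topic layout documented above (not tied to any particular MQTT client library):

```python
import json

def parse_status_topic(topic: str, payload: bytes):
    """Split 'vesper/{device_id}/status/{kind}' into (device_id, kind, payload dict).

    Returns None for topics that don't match the expected shape.
    """
    parts = topic.split("/")
    if len(parts) != 4 or parts[0] != "vesper" or parts[2] != "status":
        return None
    return parts[1], parts[3], json.loads(payload)

# In an MQTT on_message callback you would then dispatch:
#   device_id, kind, data = parse_status_topic(msg.topic, msg.payload)
#   if kind == "alerts": on_alerts_message(device_id, data)
#   elif kind == "info": on_info_message(device_id, data)
```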
### Database model — active alerts per device

Create a table (or document) to store the current alert state per device:

```sql
CREATE TABLE device_alerts (
    device_id   TEXT NOT NULL,
    subsystem   TEXT NOT NULL,
    state       TEXT NOT NULL,      -- WARNING | CRITICAL | FAILED
    message     TEXT,
    updated_at  TIMESTAMP NOT NULL,
    PRIMARY KEY (device_id, subsystem)
);
```

Or equivalent in your ORM / MongoDB / Redis structure.
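If the backing store is SQLite, the upsert-by-`(device_id, subsystem)` semantics map directly onto `INSERT ... ON CONFLICT`. A sketch against the table above; plain `sqlite3` is shown for brevity, the production handler would go through whatever DB layer you already use:

```python
import sqlite3

def upsert_alert(conn: sqlite3.Connection, device_id: str, subsystem: str,
                 state: str, message: str, updated_at: str) -> None:
    """Insert or update the single active-alert row for (device_id, subsystem)."""
    conn.execute(
        """
        INSERT INTO device_alerts (device_id, subsystem, state, message, updated_at)
        VALUES (?, ?, ?, ?, ?)
        ON CONFLICT(device_id, subsystem) DO UPDATE SET
            state = excluded.state,
            message = excluded.message,
            updated_at = excluded.updated_at
        """,
        (device_id, subsystem, state, message, updated_at),
    )
```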
### MQTT message handler — alerts topic

```python
def on_alerts_message(device_id: str, payload: dict):
    subsystem = payload["subsystem"]
    state = payload["state"]
    message = payload.get("msg", "")

    if state == "CLEARED":
        # Remove alert from active set
        db.device_alerts.delete(device_id=device_id, subsystem=subsystem)
    else:
        # Upsert — create or update
        db.device_alerts.upsert(
            device_id=device_id,
            subsystem=subsystem,
            state=state,
            message=message,
            updated_at=now(),
        )

    # Optionally push a WebSocket event to the console UI
    ws_broadcast(device_id, {"event": "alert_update", "subsystem": subsystem, "state": state})
```

### MQTT message handler — info topic

```python
def on_info_message(device_id: str, payload: dict):
    event_type = payload["type"]
    data = payload.get("payload", {})

    # Store or forward as needed — e.g. update device playback state
    if event_type == "playback_started":
        db.devices.update(device_id, playback_active=True, melody_uid=data.get("melody_uid"))
    elif event_type == "playback_stopped":
        db.devices.update(device_id, playback_active=False, melody_uid=None)
```

### API endpoint — get active alerts for a device

```
GET /api/devices/{device_id}/alerts
```

Returns the current active alert set (the upserted rows from the table above):

```json
[
  { "subsystem": "FileManager", "state": "WARNING", "message": "SD mount failed", "updated_at": "..." },
  { "subsystem": "TimeKeeper", "state": "WARNING", "message": "NTP sync failed", "updated_at": "..." }
]
```

An empty array means the device is fully healthy (no active alerts).

### Console UI guidance

- Device list: show a coloured dot next to each device (green = no alerts, yellow = warnings, red = critical/failed). Update via WebSocket push.
- Device detail page: show an "Active Alerts" section that renders the alert set statically. Do not render a scrolling alert log — just the current state.
- When a `CLEARED` event arrives, remove the entry from the UI immediately.

---

## What NOT to do

- **Do not log every heartbeat** as a visible event. Heartbeats are internal housekeeping.
- **Do not poll the device** for health status — the device pushes on change.
- **Do not store alerts as an append-only log** — upsert by `(device_id, subsystem)`. The server holds the current state, not a history.

---

## Testing

1. Flash a device with firmware v155+
2. Subscribe manually:
   ```bash
   mosquitto_sub -h <broker> -t "vesper/+/status/alerts" -v
   mosquitto_sub -h <broker> -t "vesper/+/status/info" -v
   ```
3. Remove the SD card from the device — expect a `FileManager` `WARNING` alert within 5 minutes (next health check cycle), or trigger it immediately by sending
   ```json
   { "v": 2, "cmd": "system.health" }
   ```
   to `vesper/{device_id}/control`
4. Reinsert the SD card — expect a `FileManager` `CLEARED` alert on the next health check
---

**New file: `.claude/crm-build-plan.md` (243 lines)**
# BellSystems CRM — Build Plan & Step Prompts

## Overview

A bespoke CRM module built directly into the existing BellSystems web console.
Stack: FastAPI backend (Firestore), React + Vite frontend.
No new auth — uses the existing JWT + permission system.
No file storage on the VPS — all media lives on Nextcloud via WebDAV.

---

## Architecture Summary

### Backend
- New module: `backend/crm/` with `models.py`, `service.py`, `router.py`
- Firestore collections: `crm_customers`, `crm_orders`, `crm_products`
- SQLite (existing `mqtt_data.db`) for the comms log (high-write, queryable)
- Router registered in `backend/main.py` as `/api/crm`

### Frontend
- New section: `frontend/src/crm/`
- Routes added to `frontend/src/App.jsx`
- Nav entries added to `frontend/src/layout/Sidebar.jsx`

### Integrations (later steps)
- Nextcloud: WebDAV via `httpx` in the backend
- Email: IMAP (read) + SMTP (send) via `imaplib` / `smtplib`
- WhatsApp: Meta Cloud API webhook
- FreePBX: Asterisk AMI socket listener

---

## Data Model Reference

### `crm_customers` (Firestore)
```json
{
  "id": "auto",
  "name": "Στέλιος Μπιμπης",
  "organization": "Ενορία Αγ. Παρασκευής",
  "contacts": [
    { "type": "email", "label": "personal", "value": "...", "primary": true },
    { "type": "phone", "label": "mobile", "value": "...", "primary": true }
  ],
  "notes": [
    { "text": "...", "by": "user_name", "at": "ISO datetime" }
  ],
  "location": { "city": "", "country": "", "region": "" },
  "language": "el",
  "tags": [],
  "owned_items": [
    { "type": "console_device", "device_id": "UID", "label": "..." },
    { "type": "product", "product_id": "pid", "product_name": "...", "quantity": 1, "serial_numbers": [] },
    { "type": "freetext", "description": "...", "serial_number": "", "notes": "" }
  ],
  "linked_user_ids": [],
  "nextcloud_folder": "05_Customers/FOLDER_NAME",
  "created_at": "ISO",
  "updated_at": "ISO"
}
```

### `crm_orders` (Firestore)
```json
{
  "id": "auto",
  "customer_id": "ref",
  "order_number": "ORD-2026-001",
  "status": "draft",
  "items": [
    {
      "type": "console_device|product|freetext",
      "product_id": "",
      "product_name": "",
      "description": "",
      "quantity": 1,
      "unit_price": 0.0,
      "serial_numbers": []
    }
  ],
  "subtotal": 0.0,
  "discount": { "type": "percentage|fixed", "value": 0, "reason": "" },
  "total_price": 0.0,
  "currency": "EUR",
  "shipping": {
    "method": "",
    "tracking_number": "",
    "carrier": "",
    "shipped_at": null,
    "delivered_at": null,
    "destination": ""
  },
  "payment_status": "pending",
  "invoice_path": "",
  "notes": "",
  "created_at": "ISO",
  "updated_at": "ISO"
}
```
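The `subtotal`, `discount`, and `total_price` fields above are related by a simple rule (the step notes say the frontend computes them before saving). Sketched in Python for precision, with field names following the schema; the actual implementation would live in the React form:

```python
def order_totals(items, discount=None):
    """Compute (subtotal, total_price) from order items and an optional discount.

    items: list of {"quantity": int, "unit_price": float} dicts
    discount: {"type": "percentage"|"fixed", "value": number} or None
    """
    subtotal = sum(i.get("quantity", 1) * i.get("unit_price", 0.0) for i in items)
    total = subtotal
    if discount:
        if discount.get("type") == "percentage":
            total -= subtotal * discount.get("value", 0) / 100.0
        elif discount.get("type") == "fixed":
            total -= discount.get("value", 0)
    # Clamp at zero so an over-large fixed discount never produces a negative total
    return round(subtotal, 2), round(max(total, 0.0), 2)
```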
### `crm_products` (Firestore)
```json
{
  "id": "auto",
  "name": "Vesper Plus",
  "sku": "VSP-001",
  "category": "controller|striker|clock|part|repair_service",
  "description": "",
  "price": 0.0,
  "currency": "EUR",
  "costs": {
    "pcb": 0.0, "components": 0.0, "enclosure": 0.0,
    "labor_hours": 0, "labor_rate": 0.0, "shipping_in": 0.0,
    "total": 0.0
  },
  "stock": { "on_hand": 0, "reserved": 0, "available": 0 },
  "nextcloud_folder": "02_Products/FOLDER",
  "linked_device_type": "",
  "active": true,
  "created_at": "ISO",
  "updated_at": "ISO"
}
```

### `crm_comms_log` (SQLite table — existing mqtt_data.db)
```sql
CREATE TABLE crm_comms_log (
    id              TEXT PRIMARY KEY,
    customer_id     TEXT NOT NULL,
    type            TEXT NOT NULL,   -- email|whatsapp|call|sms|note|in_person
    direction       TEXT NOT NULL,   -- inbound|outbound|internal
    subject         TEXT,
    body            TEXT,
    attachments     TEXT,            -- JSON array of {filename, nextcloud_path}
    ext_message_id  TEXT,            -- IMAP uid, WhatsApp msg id, AMI call id
    logged_by       TEXT,
    occurred_at     TEXT NOT NULL,
    created_at      TEXT NOT NULL
);
```

### `crm_media` (SQLite table — existing mqtt_data.db)
```sql
CREATE TABLE crm_media (
    id              TEXT PRIMARY KEY,
    customer_id     TEXT,
    order_id        TEXT,
    filename        TEXT NOT NULL,
    nextcloud_path  TEXT NOT NULL,
    mime_type       TEXT,
    direction       TEXT,            -- received|sent|internal
    tags            TEXT,            -- JSON array
    uploaded_by     TEXT,
    created_at      TEXT NOT NULL
);
```
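`attachments` and `tags` are JSON strings in SQLite but lists in the API models, so every read and write crosses that boundary. A minimal round-trip sketch; the helper names are illustrative, not existing code:

```python
import json

def pack_json_list(value=None):
    """List -> TEXT column holding a JSON array (None becomes '[]')."""
    return json.dumps(value or [])

def unpack_json_list(raw=None):
    """TEXT column -> list, tolerating NULLs from older rows."""
    return json.loads(raw) if raw else []
```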
---

## IMPORTANT NOTES FOR ALL STEPS

- **Backend location**: `c:\development\bellsystems-cp\backend\`
- **Frontend location**: `c:\development\bellsystems-cp\frontend\`
- **Auth pattern**: All routes use `Depends(require_permission("crm", "view"))` or `"edit"`. Import from `auth.dependencies`.
- **Firestore pattern**: Use `from shared.firebase import get_db`. See `backend/devices/service.py` for reference patterns.
- **SQLite pattern**: Use `from mqtt import database as mqtt_db` — `mqtt_db.db` is the aiosqlite connection. See `backend/mqtt/database.py`.
- **Frontend auth**: `getAuthHeaders()` from `../api/auth` gives Bearer token headers. See any existing page for the pattern.
- **Frontend routing**: Routes live in `frontend/src/App.jsx`. Sidebar nav in `frontend/src/layout/Sidebar.jsx`.
- **Token**: localStorage key is `"access_token"`.
- **UI pattern**: Use the existing component style — `SectionCard`, `FieldRow`, inline styles for grids. See `frontend/src/devices/` for reference.
- **No new dependencies unless absolutely necessary.**

---

## Step 1 — Backend: CRM Module Scaffold + Products CRUD

**File**: `.claude/crm-step-01.md`

---

## Step 2 — Backend: Customers CRUD

**File**: `.claude/crm-step-02.md`

---

## Step 3 — Backend: Orders CRUD

**File**: `.claude/crm-step-03.md`

---

## Step 4 — Backend: Comms Log + Media (SQLite)

**File**: `.claude/crm-step-04.md`

---

## Step 5 — Frontend: Products Module

**File**: `.claude/crm-step-05.md`

---

## Step 6 — Frontend: Customers List + Detail Page

**File**: `.claude/crm-step-06.md`

---

## Step 7 — Frontend: Orders Module

**File**: `.claude/crm-step-07.md`

---

## Step 8 — Frontend: Comms Log + Media Tab (manual entry)

**File**: `.claude/crm-step-08.md`

---

## Step 9 — Integration: Nextcloud WebDAV

**File**: `.claude/crm-step-09.md`

---

## Step 10 — Integration: IMAP/SMTP Email

**File**: `.claude/crm-step-10.md`

---

## Step 11 — Integration: WhatsApp Business API

**File**: `.claude/crm-step-11.md`

---

## Step 12 — Integration: FreePBX AMI Call Logging

**File**: `.claude/crm-step-12.md`
---

**New file: `.claude/crm-step-01.md` (49 lines)**
# CRM Step 01 — Backend: Module Scaffold + Products CRUD

## Context
Read `.claude/crm-build-plan.md` first for the full data models, conventions, and IMPORTANT NOTES.

## Task
Create the `backend/crm/` module with Products CRUD. This is the first CRM backend step.

## What to build

### 1. `backend/crm/__init__.py` — empty

### 2. `backend/crm/models.py`
Pydantic models for Products:
- `ProductCosts` — pcb, components, enclosure, labor_hours, labor_rate, shipping_in, total (all float/int, all optional)
- `ProductStock` — on_hand, reserved, available (int, defaults 0)
- `ProductCategory` enum — controller, striker, clock, part, repair_service
- `ProductCreate` — name, sku (optional), category, description (optional), price (float), currency (default "EUR"), costs (ProductCosts, optional), stock (ProductStock, optional), nextcloud_folder (optional), linked_device_type (optional), active (bool, default True)
- `ProductUpdate` — all fields Optional
- `ProductInDB` — extends ProductCreate + id (str), created_at (str), updated_at (str)
- `ProductListResponse` — products: List[ProductInDB], total: int

### 3. `backend/crm/service.py`
Firestore collection: `crm_products`
Functions:
- `list_products(search=None, category=None, active_only=False) -> List[ProductInDB]`
- `get_product(product_id) -> ProductInDB` — raises HTTPException 404 if not found
- `create_product(data: ProductCreate) -> ProductInDB` — generates a UUID id, sets created_at/updated_at to ISO now
- `update_product(product_id, data: ProductUpdate) -> ProductInDB` — partial update (only set fields), refreshes updated_at
- `delete_product(product_id) -> None` — raises 404 if not found
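"Partial update (only set fields)" is the one subtle part of the service layer. A dictionary-level sketch of the merge semantics; in the real code the `changes` dict would come from the Pydantic model (e.g. `data.dict(exclude_unset=True)` on Pydantic v1), and `apply_partial_update` is a hypothetical helper name:

```python
from datetime import datetime

def apply_partial_update(stored: dict, changes: dict) -> dict:
    """Overlay only the provided fields onto the stored document, then bump updated_at."""
    merged = {**stored, **changes}
    merged["updated_at"] = datetime.utcnow().isoformat()
    return merged
```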
### 4. `backend/crm/router.py`
Prefix: `/api/crm/products`, tag: `crm-products`
All routes require `require_permission("crm", "view")` for GET and `require_permission("crm", "edit")` for POST/PUT/DELETE.
- `GET /` → list_products (query params: search, category, active_only)
- `GET /{product_id}` → get_product
- `POST /` → create_product
- `PUT /{product_id}` → update_product
- `DELETE /{product_id}` → delete_product

### 5. Register in `backend/main.py`
Add: `from crm.router import router as crm_products_router`
Add: `app.include_router(crm_products_router)` (after the existing routers)

## Notes
- Use `uuid.uuid4()` for IDs
- Use `datetime.utcnow().isoformat()` for timestamps
- Follow the exact Firestore pattern from `backend/devices/service.py`
- No new pip dependencies needed
---

**New file: `.claude/crm-step-02.md` (61 lines)**
# CRM Step 02 — Backend: Customers CRUD

## Context
Read `.claude/crm-build-plan.md` first for the full data models, conventions, and IMPORTANT NOTES.
Step 01 must be complete (the `backend/crm/` module exists).

## Task
Add Customers models, service, and router to `backend/crm/`.

## What to build

### 1. Add to `backend/crm/models.py`

**Contact entry:**
- `ContactType` enum — email, phone, whatsapp, other
- `CustomerContact` — type (ContactType), label (str, e.g. "personal"/"church"), value (str), primary (bool, default False)

**Note entry:**
- `CustomerNote` — text (str), by (str), at (str, ISO datetime)

**Owned items (3 tiers):**
- `OwnedItemType` enum — console_device, product, freetext
- `OwnedItem`:
  - type: OwnedItemType
  - For console_device: device_id (Optional[str]), label (Optional[str])
  - For product: product_id (Optional[str]), product_name (Optional[str]), quantity (Optional[int]), serial_numbers (Optional[List[str]])
  - For freetext: description (Optional[str]), serial_number (Optional[str]), notes (Optional[str])

**Location:**
- `CustomerLocation` — city (Optional[str]), country (Optional[str]), region (Optional[str])

**Customer models:**
- `CustomerCreate` — name (str), organization (Optional[str]), contacts (List[CustomerContact], default []), notes (List[CustomerNote], default []), location (Optional[CustomerLocation]), language (str, default "el"), tags (List[str], default []), owned_items (List[OwnedItem], default []), linked_user_ids (List[str], default []), nextcloud_folder (Optional[str])
- `CustomerUpdate` — all fields Optional
- `CustomerInDB` — extends CustomerCreate + id, created_at, updated_at
- `CustomerListResponse` — customers: List[CustomerInDB], total: int

### 2. Add to `backend/crm/service.py`
Firestore collection: `crm_customers`
Functions:
- `list_customers(search=None, tag=None) -> List[CustomerInDB]`
  - search matches against name, organization, and any contact value
- `get_customer(customer_id) -> CustomerInDB` — 404 if not found
- `create_customer(data: CustomerCreate) -> CustomerInDB`
- `update_customer(customer_id, data: CustomerUpdate) -> CustomerInDB`
- `delete_customer(customer_id) -> None`

### 3. Add to `backend/crm/router.py`
Add a second router (or extend the existing file) with prefix `/api/crm/customers`:
- `GET /` — list_customers (query: search, tag)
- `GET /{customer_id}` — get_customer
- `POST /` — create_customer
- `PUT /{customer_id}` — update_customer
- `DELETE /{customer_id}` — delete_customer

Register this router in `backend/main.py` alongside the products router.

## Notes
- OwnedItem is a flexible struct — store all fields; the service doesn't validate which fields are relevant per type (the frontend handles that)
- linked_user_ids are Firebase Auth UIDs (strings) — no validation needed here, just store them
- Search in list_customers: filter client-side after fetching all documents (small dataset)
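The client-side search described above can be a small predicate applied after fetching the collection. A sketch; the function name is illustrative:

```python
def matches_search(customer: dict, query: str) -> bool:
    """Case-insensitive match against name, organization, and any contact value."""
    q = query.lower()
    haystacks = [customer.get("name", ""), customer.get("organization") or ""]
    haystacks += [c.get("value", "") for c in customer.get("contacts", [])]
    return any(q in h.lower() for h in haystacks)
```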
---

**New file: `.claude/crm-step-03.md` (60 lines)**
# CRM Step 03 — Backend: Orders CRUD

## Context
Read `.claude/crm-build-plan.md` first for the full data models, conventions, and IMPORTANT NOTES.
Steps 01 and 02 must be complete.

## Task
Add Orders models, service, and router to `backend/crm/`.

## What to build

### 1. Add to `backend/crm/models.py`

**Enums:**
- `OrderStatus` — draft, confirmed, in_production, shipped, delivered, cancelled
- `PaymentStatus` — pending, partial, paid

**Structs:**
- `OrderDiscount` — type (str: "percentage" | "fixed"), value (float, default 0), reason (Optional[str])
- `OrderShipping` — method (Optional[str]), tracking_number (Optional[str]), carrier (Optional[str]), shipped_at (Optional[str]), delivered_at (Optional[str]), destination (Optional[str])
- `OrderItem`:
  - type: str (console_device | product | freetext)
  - product_id: Optional[str]
  - product_name: Optional[str]
  - description: Optional[str] ← for freetext items
  - quantity: int, default 1
  - unit_price: float, default 0.0
  - serial_numbers: List[str], default []

**Order models:**
- `OrderCreate` — customer_id (str), order_number (Optional[str] — auto-generated if not provided), status (OrderStatus, default draft), items (List[OrderItem], default []), subtotal (float, default 0), discount (Optional[OrderDiscount]), total_price (float, default 0), currency (str, default "EUR"), shipping (Optional[OrderShipping]), payment_status (PaymentStatus, default pending), invoice_path (Optional[str]), notes (Optional[str])
- `OrderUpdate` — all fields Optional
- `OrderInDB` — extends OrderCreate + id, created_at, updated_at
- `OrderListResponse` — orders: List[OrderInDB], total: int

### 2. Add to `backend/crm/service.py`
Firestore collection: `crm_orders`

Auto order number generation: `ORD-{YYYY}-{NNN}` — query existing orders for the current year, increment the max.
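The `ORD-{YYYY}-{NNN}` generation can be a simple scan over this year's order numbers; per the notes, atomicity is not required. A best-effort sketch, where `next_order_number` is a hypothetical helper name:

```python
import re

def next_order_number(existing_numbers, year):
    """Return the next ORD-{YYYY}-{NNN} label given the order numbers already stored."""
    pattern = re.compile(rf"^ORD-{year}-(\d+)$")
    taken = [int(m.group(1)) for n in existing_numbers if (m := pattern.match(n))]
    return f"ORD-{year}-{max(taken, default=0) + 1:03d}"
```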
Functions:
- `list_orders(customer_id=None, status=None, payment_status=None) -> List[OrderInDB]`
- `get_order(order_id) -> OrderInDB` — 404 if not found
- `create_order(data: OrderCreate) -> OrderInDB` — auto-generate order_number if not set
- `update_order(order_id, data: OrderUpdate) -> OrderInDB`
- `delete_order(order_id) -> None`

### 3. Add to `backend/crm/router.py`
Prefix `/api/crm/orders`:
- `GET /` — list_orders (query: customer_id, status, payment_status)
- `GET /{order_id}` — get_order
- `POST /` — create_order
- `PUT /{order_id}` — update_order
- `DELETE /{order_id}` — delete_order

Register in `backend/main.py`.

## Notes
- subtotal and total_price are stored as-is (calculated by the frontend before POST/PUT). The backend does not recalculate.
- Order number generation doesn't need to be atomic — just a best-effort sequential label.
---

**New file: `.claude/crm-step-04.md` (96 lines)**
# CRM Step 04 — Backend: Comms Log + Media (SQLite)

## Context
Read `.claude/crm-build-plan.md` for the full schema, conventions, and IMPORTANT NOTES.
Steps 01–03 must be complete.

## Task
Add `crm_comms_log` and `crm_media` tables to the existing SQLite DB, plus CRUD endpoints.

## What to build

### 1. Add tables to `backend/mqtt/database.py`
Inside `init_db()`, add these CREATE TABLE IF NOT EXISTS statements alongside the existing tables:

```sql
CREATE TABLE IF NOT EXISTS crm_comms_log (
    id              TEXT PRIMARY KEY,
    customer_id     TEXT NOT NULL,
    type            TEXT NOT NULL,
    direction       TEXT NOT NULL,
    subject         TEXT,
    body            TEXT,
    attachments     TEXT DEFAULT '[]',
    ext_message_id  TEXT,
    logged_by       TEXT,
    occurred_at     TEXT NOT NULL,
    created_at      TEXT NOT NULL
);

CREATE TABLE IF NOT EXISTS crm_media (
    id              TEXT PRIMARY KEY,
    customer_id     TEXT,
    order_id        TEXT,
    filename        TEXT NOT NULL,
    nextcloud_path  TEXT NOT NULL,
    mime_type       TEXT,
    direction       TEXT,
    tags            TEXT DEFAULT '[]',
    uploaded_by     TEXT,
    created_at      TEXT NOT NULL
);
```

### 2. Add to `backend/crm/models.py`

**Comms:**
- `CommType` enum — email, whatsapp, call, sms, note, in_person
- `CommDirection` enum — inbound, outbound, internal
- `CommAttachment` — filename (str), nextcloud_path (str)
- `CommCreate` — customer_id, type (CommType), direction (CommDirection), subject (Optional[str]), body (Optional[str]), attachments (List[CommAttachment], default []), ext_message_id (Optional[str]), logged_by (Optional[str]), occurred_at (str, ISO — defaults to now if not provided)
- `CommUpdate` — subject, body, occurred_at, all Optional
- `CommInDB` — all fields + id, created_at
- `CommListResponse` — entries: List[CommInDB], total: int

**Media:**
- `MediaDirection` enum — received, sent, internal
- `MediaCreate` — customer_id (Optional[str]), order_id (Optional[str]), filename, nextcloud_path, mime_type (Optional), direction (MediaDirection, optional), tags (List[str], default []), uploaded_by (Optional[str])
- `MediaInDB` — all fields + id, created_at
- `MediaListResponse` — items: List[MediaInDB], total: int

### 3. Add to `backend/crm/service.py`
Import `from mqtt import database as mqtt_db` for aiosqlite access.

**Comms functions (all async):**
- `list_comms(customer_id, type=None, direction=None, limit=100) -> List[CommInDB]` — SELECT ... WHERE customer_id=? ORDER BY occurred_at DESC
- `get_comm(comm_id) -> CommInDB` — 404 if not found
- `create_comm(data: CommCreate) -> CommInDB` — uuid id, created_at now, store attachments as a JSON string
- `update_comm(comm_id, data: CommUpdate) -> CommInDB`
- `delete_comm(comm_id) -> None`

**Media functions (all async):**
- `list_media(customer_id=None, order_id=None) -> List[MediaInDB]`
- `create_media(data: MediaCreate) -> MediaInDB`
- `delete_media(media_id) -> None`

Parse the `attachments` and `tags` JSON strings back to lists when returning models.

### 4. Add to `backend/crm/router.py`
Prefix `/api/crm/comms`:
- `GET /` — list_comms (query: customer_id required, type, direction)
- `POST /` — create_comm
- `PUT /{comm_id}` — update_comm
- `DELETE /{comm_id}` — delete_comm

Prefix `/api/crm/media`:
- `GET /` — list_media (query: customer_id or order_id)
- `POST /` — create_media (metadata only — no file upload here, that's Step 9)
- `DELETE /{media_id}` — delete_media

Register both in `backend/main.py`.

## Notes
- Use `mqtt_db.db` — it is an aiosqlite connection; use the `async with mqtt_db.db.execute(...)` pattern
- Look at `backend/mqtt/database.py` for the exact aiosqlite usage pattern
- attachments and tags are stored as JSON strings in SQLite, deserialized to lists in the Pydantic model
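Reads from `crm_comms_log` therefore have to deserialize the `attachments` column on the way out. A synchronous sketch of the row shaping, with column order taken from the CREATE TABLE above; the real service would do the same inside the aiosqlite `async with` pattern, and `comm_row_to_dict` is a hypothetical helper name:

```python
import json

# Column order matches the crm_comms_log CREATE TABLE statement above
COMM_COLUMNS = ("id", "customer_id", "type", "direction", "subject", "body",
                "attachments", "ext_message_id", "logged_by", "occurred_at", "created_at")

def comm_row_to_dict(row):
    """Map a crm_comms_log row tuple to a dict, parsing the JSON attachments column."""
    record = dict(zip(COMM_COLUMNS, row))
    record["attachments"] = json.loads(record.get("attachments") or "[]")
    return record
```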
---

**New file: `.claude/crm-step-05.md` (55 lines)**
# CRM Step 05 — Frontend: Products Module

## Context
Read `.claude/crm-build-plan.md` for full context and IMPORTANT NOTES.
Backend Steps 01–04 must be complete and running.

## Task
Build the Products section of the CRM frontend.

## Files to create

### `frontend/src/crm/products/ProductList.jsx`
- Fetch `GET /api/crm/products` with auth headers
- Show a table/list: Name, SKU, Category, Price, Stock (available), Active badge
- Search input (client-side filter on name/sku)
- Filter dropdown for category
- "New Product" button → navigate to `/crm/products/new`
- Row click → navigate to `/crm/products/:id`

### `frontend/src/crm/products/ProductForm.jsx`
Used for both create and edit. Receives a `productId` prop (null = create mode).
Fields:
- name (required), sku, category (dropdown from the enum), description (textarea)
- price (number), currency (default EUR)
- Costs section (collapsible): pcb, components, enclosure, labor_hours, labor_rate, shipping_in — show the computed total
- Stock section: on_hand, reserved — show available = on_hand - reserved (read-only)
- nextcloud_folder, linked_device_type, active (toggle)
- Save / Cancel buttons
- In edit mode: show a Delete button with confirmation

On save: POST `/api/crm/products` or PUT `/api/crm/products/:id`
On delete: DELETE `/api/crm/products/:id`, then navigate back to the list

### `frontend/src/crm/products/index.js`
Export both components.

## Routing
In `frontend/src/App.jsx` add:
```jsx
<Route path="/crm/products" element={<ProductList />} />
<Route path="/crm/products/new" element={<ProductForm />} />
<Route path="/crm/products/:id" element={<ProductForm />} />
```

## Sidebar
In `frontend/src/layout/Sidebar.jsx` add a "CRM" section with:
- Products → `/crm/products`

(Customers and Orders will be added in later steps.)

## Notes
- Use existing UI patterns: SectionCard wrapper, inline styles for the layout grid
- Follow the same auth header pattern as other frontend modules (getAuthHeaders from `../api/auth` or equivalent)
- Currency is always EUR for now — no need for a selector
- Computed costs total = pcb + components + enclosure + (labor_hours * labor_rate) + shipping_in, shown live as the user types
- Category values: controller, striker, clock, part, repair_service — display as human-readable labels
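The two computed values in the form (costs total and available stock) follow directly from the formulas in the notes above. Sketched in Python for precision; the actual implementation lives in React state:

```python
def costs_total(costs: dict) -> float:
    """pcb + components + enclosure + (labor_hours * labor_rate) + shipping_in."""
    return (
        costs.get("pcb", 0.0)
        + costs.get("components", 0.0)
        + costs.get("enclosure", 0.0)
        + costs.get("labor_hours", 0) * costs.get("labor_rate", 0.0)
        + costs.get("shipping_in", 0.0)
    )

def stock_available(stock: dict) -> int:
    """available = on_hand - reserved (shown read-only in the form)."""
    return stock.get("on_hand", 0) - stock.get("reserved", 0)
```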
---

**New file: `.claude/crm-step-06.md` (84 lines)**
|
||||
# CRM Step 06 — Frontend: Customers List + Detail Page

## Context

Read `.claude/crm-build-plan.md` for full context, data models, and IMPORTANT NOTES.

Backend Steps 01–04 and Frontend Step 05 must be complete.

## Task

Build the Customers section — the core of the CRM.

## Files to create

### `frontend/src/crm/customers/CustomerList.jsx`

- Fetch `GET /api/crm/customers` (query: search, tag)
- Show cards or table rows: Name, Organization, Location, Tags, primary contact
- Search input → query param `search`
- "New Customer" button → `/crm/customers/new`
- Row/card click → `/crm/customers/:id`

### `frontend/src/crm/customers/CustomerForm.jsx`

Create/edit form. Receives `customerId` prop (null = create).

**Sections:**

1. **Basic Info** — name, organization, language, tags (pill input), nextcloud_folder
2. **Location** — city, country, region
3. **Contacts** — dynamic list of `{ type, label, value, primary }` entries. Add/remove rows. Radio to set primary per type group.
4. **Notes** — dynamic list of `{ text, by, at }`. Add new note button. Existing notes shown as read-only with author/date. `by` auto-filled from current user name.
5. **Owned Items** — dynamic list with type selector:
   - `console_device`: device_id text input + label
   - `product`: product selector (fetch `/api/crm/products` for dropdown) + quantity + serial_numbers (comma-separated input)
   - `freetext`: description + serial_number + notes

   Add/remove rows.
6. **Linked App Accounts** — list of Firebase UIDs (simple text inputs, add/remove). Label: "Linked App User IDs"

Save: POST or PUT. Delete with confirmation.

### `frontend/src/crm/customers/CustomerDetail.jsx`

The main customer page. Fetches customer by ID. Tab layout:

**Tab 1: Overview**
- Show all info from CustomerForm fields in read-only view
- "Edit" button → opens CustomerForm in a modal or navigates to edit route

**Tab 2: Orders**
- Fetch `GET /api/crm/orders?customer_id=:id`
- List orders: order_number, status badge, total_price, date
- "New Order" button → navigate to `/crm/orders/new?customer_id=:id`
- Row click → `/crm/orders/:id`

**Tab 3: Comms**
- Fetch `GET /api/crm/comms?customer_id=:id`
- Timeline view sorted by occurred_at descending
- Each entry shows: type icon, direction indicator, subject/body preview, date
- "Log Entry" button → inline form to create a new comms entry (type, direction, subject, body, occurred_at)

**Tab 4: Media**
- Fetch `GET /api/crm/media?customer_id=:id`
- Grid of files: filename, direction badge (Received/Sent/Internal), date
- "Add Media Record" button → form with filename, nextcloud_path, direction, tags (manual entry for now — Nextcloud integration comes in Step 9)

**Tab 5: Devices** (read-only summary)
- Display `owned_items` grouped by type
- For console_device items: link to `/devices/:device_id` in a new tab

### `frontend/src/crm/customers/index.js`

Export all components.

## Routing in `frontend/src/App.jsx`

```jsx
<Route path="/crm/customers" element={<CustomerList />} />
<Route path="/crm/customers/new" element={<CustomerForm />} />
<Route path="/crm/customers/:id" element={<CustomerDetail />} />
<Route path="/crm/customers/:id/edit" element={<CustomerForm />} />
```

## Sidebar update

Add to CRM section:
- Customers → `/crm/customers`

## Notes

- ALL hooks in CustomerDetail must be before any early returns (loading/error states)
- Tag input: comma or enter to add, click pill to remove
- Contact type icons: use simple text labels or emoji (📧 📞 💬) — keep it simple
- Comms type icons: simple colored badges per type (email=blue, whatsapp=green, call=yellow, note=grey)
- No file upload UI yet in Media tab — just nextcloud_path text field for now (Step 9 adds real upload)

71
.claude/crm-step-07.md
Normal file
@@ -0,0 +1,71 @@

# CRM Step 07 — Frontend: Orders Module

## Context

Read `.claude/crm-build-plan.md` for full context, data models, and IMPORTANT NOTES.

Steps 01–06 must be complete.

## Task

Build the Orders section.

## Files to create

### `frontend/src/crm/orders/OrderList.jsx`

- Fetch `GET /api/crm/orders` (query: status, payment_status)
- Table: Order #, Customer name (resolve from customer_id via separate fetch or denormalize), Status badge, Total, Payment status, Date
- Filter dropdowns: Status, Payment Status
- "New Order" button → `/crm/orders/new`
- Row click → `/crm/orders/:id`

### `frontend/src/crm/orders/OrderForm.jsx`

Create/edit. Receives `orderId` prop and optional `customerId` from query param.

**Sections:**

1. **Customer** — searchable dropdown (fetch `/api/crm/customers`). Shows name + organization.
2. **Order Info** — order_number (auto, editable), status (dropdown), currency
3. **Items** — dynamic list. Each item:
   - type selector: console_device | product | freetext
   - product: dropdown from `/api/crm/products` (auto-fills product_name, unit_price)
   - console_device: text input for device_id + label
   - freetext: description text input
   - quantity (number), unit_price (number), serial_numbers (comma-separated)
   - Remove row button
   - Add Item button
4. **Pricing** — show computed subtotal (sum of qty * unit_price). Discount: type toggle (% or fixed) + value input + reason. Show computed total = subtotal - discount. These values are sent to backend as-is.
5. **Payment** — payment_status dropdown, invoice_path (nextcloud path text input)
6. **Shipping** — method, carrier, tracking_number, destination, shipped_at (date), delivered_at (date)
7. **Notes** — textarea

Save → POST or PUT. Delete with confirmation.

### `frontend/src/crm/orders/OrderDetail.jsx`

Read-only view of a single order.
- Header: order number, status badge, customer name (link to customer)
- Items table: product/description, qty, unit price, line total
- Pricing summary: subtotal, discount, total
- Shipping card: all shipping fields
- Payment card: status, invoice path (if set, show as link)
- Notes
- Edit button → OrderForm
- Back to customer button

### `frontend/src/crm/orders/index.js`

Export all components.

## Routing in `frontend/src/App.jsx`

```jsx
<Route path="/crm/orders" element={<OrderList />} />
<Route path="/crm/orders/new" element={<OrderForm />} />
<Route path="/crm/orders/:id" element={<OrderDetail />} />
<Route path="/crm/orders/:id/edit" element={<OrderForm />} />
```

## Sidebar update

Add to CRM section:
- Orders → `/crm/orders`

## Notes

- Status badge colors: draft=grey, confirmed=blue, in_production=orange, shipped=purple, delivered=green, cancelled=red
- Payment status: pending=yellow, partial=orange, paid=green
- Discount calculation: if type=percentage → total = subtotal * (1 - value/100). if type=fixed → total = subtotal - value
- When a product is selected from dropdown in item row, auto-fill unit_price from product.price (user can override)
- Order list needs customer names — either fetch all customers once and build a map, or add customer_name as a denormalized field when creating/updating orders (simpler: fetch customer list once)
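
The discount rule in the notes can be captured in one small pure function — a sketch with a hypothetical name (`compute_totals` is not part of the plan's API), usable both for the form preview and for sanity-checking what gets sent to the backend:

```python
def compute_totals(items, discount_type=None, discount_value=0.0):
    """Return (subtotal, total) for an order.

    items: list of dicts with 'quantity' and 'unit_price'.
    discount_type: None, 'percentage', or 'fixed'.
    """
    subtotal = sum(i["quantity"] * i["unit_price"] for i in items)
    if discount_type == "percentage":
        total = subtotal * (1 - discount_value / 100)
    elif discount_type == "fixed":
        total = subtotal - discount_value
    else:
        total = subtotal
    # Clamping to zero is an assumption here — a fixed discount larger
    # than the subtotal should not produce a negative total.
    return subtotal, max(total, 0.0)
```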

53
.claude/crm-step-08.md
Normal file
@@ -0,0 +1,53 @@

# CRM Step 08 — Frontend: Comms Log + Media (Manual Entry Polish)

## Context

Read `.claude/crm-build-plan.md` for full context and IMPORTANT NOTES.

Steps 01–07 must be complete.

## Task

Two things:
1. A standalone **Inbox** page — unified comms view across all customers
2. Polish the Comms and Media tabs on CustomerDetail (from Step 06) — improve the UI

## Files to create/update

### `frontend/src/crm/inbox/InboxPage.jsx`
- Fetch `GET /api/crm/comms/all` — a new backend endpoint added in this step (see below). The existing `/api/crm/comms` only filters by a single customer_id, and fetching comms per customer would mean one request per customer, so a single aggregate endpoint is needed.
- Show a unified timeline of all comms entries across all customers, sorted by occurred_at desc
- Each entry shows: customer name (link), type badge, direction, subject/body preview, date
- Filter by type (email/whatsapp/call/note/etc), direction, customer (dropdown)
- Pagination or virtual scroll (limit to last 100 entries)

### Backend addition needed — add to `backend/crm/router.py` and `service.py`:

`GET /api/crm/comms/all` — fetch all comms (no customer_id filter), sorted by occurred_at DESC, limit 200.
`list_all_comms(type=None, direction=None, limit=200) -> List[CommInDB]` in service.

### Comms tab improvements (update CustomerDetail.jsx)

- Full timeline view with visual connector line between entries
- Each entry is expandable — click to see full body
- Entry form as an inline slide-down panel (not a modal)
- Form fields: type (icons + labels), direction, subject, body (textarea), occurred_at (datetime-local input, defaults to now), attachments (add nextcloud_path manually for now)
- After save, refresh comms list

### Media tab improvements (update CustomerDetail.jsx)

- Group media by direction: "Received" section, "Sent" section, "Internal" section
- Show filename, tags as pills, date
- "Add Media" inline form: filename (required), nextcloud_path (required), direction (dropdown), tags (pill input)
- Delete button per item with confirmation

## Routing in `frontend/src/App.jsx`

```jsx
<Route path="/crm/inbox" element={<InboxPage />} />
```

## Sidebar update

Add to CRM section (at top of CRM group):
- Inbox → `/crm/inbox`

## Notes

- This step is mostly UI polish + the inbox page. No new integrations.
- The inbox page is the "central comms view" from the original requirements — all messages in one place
- Keep the comms entry form simple: only show attachment fields if user clicks "Add attachment"
- Type badges: email=blue, whatsapp=green, call=amber, sms=teal, note=grey, in_person=purple
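
A minimal in-memory sketch of the shape `list_all_comms` must return (illustrative only — the real service would run the equivalent SQL against SQLite rather than filter Python lists; the parameter names mirror the signature above, including the `type` name that shadows the builtin):

```python
def list_all_comms(entries, type=None, direction=None, limit=200):
    """Return comms entries across all customers, newest first.

    entries: iterable of dicts with 'type', 'direction', 'occurred_at'
    (ISO-8601 strings sort correctly as plain text).
    """
    rows = [
        e for e in entries
        if (type is None or e["type"] == type)
        and (direction is None or e["direction"] == direction)
    ]
    # occurred_at DESC, capped at `limit`
    rows.sort(key=lambda e: e["occurred_at"], reverse=True)
    return rows[:limit]
```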

92
.claude/crm-step-09.md
Normal file
@@ -0,0 +1,92 @@

# CRM Step 09 — Integration: Nextcloud WebDAV

## Context

Read `.claude/crm-build-plan.md` for full context and IMPORTANT NOTES.

Steps 01–08 must be complete.

## Task

Connect the console to Nextcloud via WebDAV so that:
1. Files in a customer's Nextcloud folder are listed in the Media tab automatically
2. Uploading a file from the console sends it to Nextcloud
3. Files can be downloaded/previewed via a backend proxy

## Backend changes

### 1. Add Nextcloud settings to `backend/config.py`
```python
nextcloud_url: str = ""            # e.g. https://cloud.example.com — empty disables the integration
nextcloud_email: str = ""          # account email; set real values via environment, never commit them
nextcloud_username: str = ""
nextcloud_password: str = ""       # account password — prefer the app password below for WebDAV
nextcloud_app_password: str = ""   # generate in Nextcloud: Settings → Security
nextcloud_base_path: str = "BellSystems"  # root folder inside Nextcloud
```

### 2. Create `backend/crm/nextcloud.py`

WebDAV client using `httpx` (already available). Functions:

```python
async def list_folder(nextcloud_path: str) -> List[dict]:
    """
    PROPFIND request to Nextcloud WebDAV.
    Returns list of {filename, path, mime_type, size, last_modified, is_dir}
    Parse the XML response (use xml.etree.ElementTree).
    URL: {nextcloud_url}/remote.php/dav/files/{username}/{nextcloud_base_path}/{nextcloud_path}
    """

async def upload_file(nextcloud_path: str, filename: str, content: bytes, mime_type: str) -> str:
    """
    PUT request to upload file.
    Returns the full nextcloud_path of the uploaded file.
    """

async def download_file(nextcloud_path: str) -> tuple[bytes, str]:
    """
    GET request. Returns (content_bytes, mime_type).
    """

async def delete_file(nextcloud_path: str) -> None:
    """
    DELETE request.
    """
```
Use HTTP Basic Auth with `nextcloud_username` and `nextcloud_app_password` (Nextcloud expects app passwords for WebDAV clients; fall back to `nextcloud_password` only if no app password is configured).
If nextcloud_url is empty string, raise HTTPException 503 "Nextcloud not configured".

### 3. Add to `backend/crm/router.py`

**Media/Nextcloud endpoints:**

`GET /api/crm/nextcloud/browse?path=05_Customers/FOLDER`
→ calls `list_folder(path)`, returns file list

`GET /api/crm/nextcloud/file?path=05_Customers/FOLDER/photo.jpg`
→ calls `download_file(path)`, returns `Response(content=bytes, media_type=mime_type)`
→ This is the proxy endpoint — frontend uses this to display images

`POST /api/crm/nextcloud/upload`
→ accepts `UploadFile` + form field `nextcloud_path` (destination folder)
→ calls `upload_file(...)`, then calls `create_media(...)` to save the metadata record
→ returns the created `MediaInDB`

`DELETE /api/crm/nextcloud/file?path=...`
→ calls `delete_file(path)`, also deletes the matching `crm_media` record if found

## Frontend changes

### Update Media tab in `CustomerDetail.jsx`

- On load: if `customer.nextcloud_folder` is set, fetch `GET /api/crm/nextcloud/browse?path={customer.nextcloud_folder}` and merge results with existing `crm_media` records. Show files from both sources — deduplicate by nextcloud_path.
- Image files: render as `<img src="/api/crm/nextcloud/file?path=..." />` via the proxy endpoint
- Other files: show as a download link hitting the same proxy endpoint
- Upload button: file picker → POST to `/api/crm/nextcloud/upload` with file + destination path (default to customer's Sent Media subfolder)
- Show upload progress indicator

### Update Media tab in `CustomerDetail.jsx` — subfolder selector

When uploading, let user choose subfolder: "Sent Media" | "Received Media" | "Internal" (maps to direction field too)

## Notes

- `httpx` is likely already in requirements. If not, add it: `httpx>=0.27.0`
- PROPFIND response is XML (DAV namespace). Parse `D:response` elements, extract `D:href` and `D:prop` children.
- The proxy approach means the VPS never stores files — it just streams them through from Nextcloud
- nextcloud_base_path in config allows the root to be `BellSystems` so paths in DB are relative to that root
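
The PROPFIND parsing hinted at in the notes can be sketched as a standalone function over the multistatus XML; the real `list_folder` would feed it the body of the httpx PROPFIND response. The name `parse_propfind` and the sample payload are illustrative:

```python
import xml.etree.ElementTree as ET

DAV = "{DAV:}"  # namespace used in Nextcloud's WebDAV responses

def parse_propfind(xml_text):
    """Parse a DAV multistatus body into the file dicts list_folder returns."""
    items = []
    for resp in ET.fromstring(xml_text).iter(DAV + "response"):
        href = resp.findtext(DAV + "href", "")
        prop = resp.find(f"{DAV}propstat/{DAV}prop")
        if prop is None:
            continue
        # A <d:resourcetype> containing <d:collection/> marks a directory
        is_dir = prop.find(f"{DAV}resourcetype/{DAV}collection") is not None
        items.append({
            "path": href,
            "filename": href.rstrip("/").rsplit("/", 1)[-1],
            "mime_type": prop.findtext(DAV + "getcontenttype", ""),
            "size": int(prop.findtext(DAV + "getcontentlength") or 0),
            "last_modified": prop.findtext(DAV + "getlastmodified", ""),
            "is_dir": is_dir,
        })
    return items
```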

102
.claude/crm-step-10.md
Normal file
@@ -0,0 +1,102 @@

# CRM Step 10 — Integration: IMAP/SMTP Email

## Context

Read `.claude/crm-build-plan.md` for full context and IMPORTANT NOTES.

Steps 01–09 must be complete.

## Task

Integrate the company email mailbox so that:
1. Emails from/to a customer's email addresses appear in their Comms tab
2. New emails can be composed and sent from the console
3. A background sync runs periodically to pull new emails

## Backend changes

### 1. Add email settings to `backend/config.py`

```python
imap_host: str = ""
imap_port: int = 993
imap_username: str = ""
imap_password: str = ""
imap_use_ssl: bool = True
smtp_host: str = ""
smtp_port: int = 587
smtp_username: str = ""
smtp_password: str = ""
smtp_use_tls: bool = True
email_sync_interval_minutes: int = 15
```

### 2. Create `backend/crm/email_sync.py`

Using standard library `imaplib` and `email` (no new deps).

```python
async def sync_emails():
    """
    Connect to IMAP. Search UNSEEN or since last sync date.
    For each email:
      - Parse from/to/subject/body (text/plain preferred, fallback to stripped HTML)
      - Check if from-address or to-address matches any customer contact (search crm_customers)
      - If match found: create crm_comms_log entry with type=email, ext_message_id=message-id header
      - Skip if ext_message_id already exists in crm_comms_log (dedup)
    Store last sync time in a simple SQLite table crm_sync_state:
      CREATE TABLE IF NOT EXISTS crm_sync_state (key TEXT PRIMARY KEY, value TEXT)
    """

# cc defaults to None rather than a mutable [] — a mutable default is shared across calls
async def send_email(to: str, subject: str, body: str, cc: Optional[List[str]] = None) -> str:
"""
|
||||
Send email via SMTP. Returns message-id.
|
||||
After sending, create a crm_comms_log entry: type=email, direction=outbound.
|
||||
"""
|
||||
```
|
||||
|
||||
### 3. Add SQLite table to `backend/mqtt/database.py`
|
||||
```sql
|
||||
CREATE TABLE IF NOT EXISTS crm_sync_state (
|
||||
key TEXT PRIMARY KEY,
|
||||
value TEXT
|
||||
);
|
||||
```
|
||||
|
||||
### 4. Add email endpoints to `backend/crm/router.py`
|
||||
|
||||
`POST /api/crm/email/send`
|
||||
Body: `{ customer_id, to, subject, body, cc (optional) }`
|
||||
→ calls `send_email(...)`, links to customer in comms_log
|
||||
|
||||
`POST /api/crm/email/sync`
|
||||
→ manually trigger `sync_emails()` (for testing / on-demand)
|
||||
→ returns count of new emails found
|
||||
|
||||
### 5. Add background sync to `backend/main.py`
|
||||
In the `startup` event, add a periodic task:
|
||||
```python
|
||||
async def email_sync_loop():
|
||||
while True:
|
||||
await asyncio.sleep(settings.email_sync_interval_minutes * 60)
|
||||
try:
|
||||
from crm.email_sync import sync_emails
|
||||
await sync_emails()
|
||||
except Exception as e:
|
||||
print(f"[EMAIL SYNC] Error: {e}")
|
||||
|
||||
asyncio.create_task(email_sync_loop())
|
||||
```
|
||||
Only start if `settings.imap_host` is set (non-empty).
|
||||
|
||||
## Frontend changes
|
||||
|
||||
### Update Comms tab in `CustomerDetail.jsx`
|
||||
- Email entries show: from/to, subject, body (truncated with expand)
|
||||
- "Compose Email" button → modal with: to (pre-filled from customer primary email), subject, body (textarea), CC
|
||||
- On send: POST `/api/crm/email/send`, add new entry to comms list
|
||||
|
||||
### Update `InboxPage.jsx`
|
||||
- Add "Sync Now" button → POST `/api/crm/email/sync`, show result count toast
|
||||
|
||||
## Notes
|
||||
- `imaplib` is synchronous — wrap the blocking calls with `asyncio.get_running_loop().run_in_executor(None, sync_fn)` so they don't block the event loop (note there is no module-level `asyncio.run_in_executor`)
- For HTML emails: strip tags with a simple regex or `html.parser` — no need for an HTML renderer
- Email body matching: compare email From/To headers against ALL customer contacts where type=email
- Don't sync attachments yet — just text content. Attachment handling can be a future step.
- If imap_host is empty string, the sync loop doesn't start and the send endpoint returns 503
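
The executor pattern from the notes, sketched end to end — `fetch_unseen` stands in for the real blocking imaplib logic and returns a stubbed message list:

```python
import asyncio

def fetch_unseen():
    """Blocking IMAP work goes here: imaplib connect, SEARCH UNSEEN,
    fetch and parse each message. Stubbed for illustration."""
    return [{"from": "customer@example.com", "subject": "Re: quote"}]

async def sync_emails():
    # imaplib blocks, so run it in the default thread pool executor
    loop = asyncio.get_running_loop()
    messages = await loop.run_in_executor(None, fetch_unseen)
    # ...here: match senders against customer contacts, insert crm_comms_log rows...
    return len(messages)
```

`asyncio.run(sync_emails())` then drives the sync without stalling other tasks while IMAP I/O is in flight.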

81
.claude/crm-step-11.md
Normal file
@@ -0,0 +1,81 @@

# CRM Step 11 — Integration: WhatsApp Business API

## Context

Read `.claude/crm-build-plan.md` for full context and IMPORTANT NOTES.

Steps 01–10 must be complete.

## Prerequisites (manual setup required before this step)

- A Meta Business account with WhatsApp Business API enabled
- A dedicated phone number registered to WhatsApp Business API (NOT a personal number)
- A Meta App with webhook configured to point to: `https://yourdomain.com/api/crm/whatsapp/webhook`
- The following values ready: `WHATSAPP_PHONE_NUMBER_ID`, `WHATSAPP_ACCESS_TOKEN`, `WHATSAPP_VERIFY_TOKEN`

## Task

Receive inbound WhatsApp messages via webhook and send outbound messages, all logged to crm_comms_log.

## Backend changes

### 1. Add to `backend/config.py`

```python
whatsapp_phone_number_id: str = ""
whatsapp_access_token: str = ""
whatsapp_verify_token: str = "change-me"  # you set this in Meta webhook config
```

### 2. Create `backend/crm/whatsapp.py`

```python
async def send_whatsapp(to_phone: str, message: str) -> str:
    """
    POST to https://graph.facebook.com/v19.0/{phone_number_id}/messages
    Headers: Authorization: Bearer {access_token}
    Body: { messaging_product: "whatsapp", to: to_phone, type: "text", text: { body: message } }
    Returns the wamid (WhatsApp message ID).
    """
```

### 3. Add webhook + send endpoints to `backend/crm/router.py`

`GET /api/crm/whatsapp/webhook`
— Meta webhook verification. Check `hub.verify_token` == settings.whatsapp_verify_token.
Return `hub.challenge` if valid, else 403.
**No auth required on this endpoint.**

`POST /api/crm/whatsapp/webhook`
— Receive inbound message events from Meta.
**No auth required on this endpoint.**
Parse payload:

```
entry[0].changes[0].value.messages[0]
  .from       → sender phone number (e.g. "306974015758")
  .id         → wamid
  .type       → "text"
  .text.body  → message content
  .timestamp  → unix timestamp
```

For each message:
1. Look up customer by phone number in crm_customers contacts (where type=phone or whatsapp)
2. If found: create crm_comms_log entry (type=whatsapp, direction=inbound, ext_message_id=wamid)
3. If not found: still log it but with customer_id="unknown:{phone}"

`POST /api/crm/whatsapp/send`
Body: `{ customer_id, to_phone, message }`
Requires auth.
→ calls `send_whatsapp(...)`, creates outbound comms_log entry

## Frontend changes

### Update Comms tab in `CustomerDetail.jsx`

- WhatsApp entries: green background, WhatsApp icon
- "Send WhatsApp" button → modal with: to_phone (pre-filled from customer's whatsapp/phone contacts), message textarea
- On send: POST `/api/crm/whatsapp/send`

### Update `InboxPage.jsx`

- WhatsApp entries are already included (from crm_comms_log)
- Add type filter option for "WhatsApp"

## Notes

- Phone number format: Meta sends numbers without `+` (e.g. "306974015758"). Normalize when matching against customer contacts (strip `+` and spaces).
- Webhook payload can contain multiple entries and messages — iterate and handle each
- Rate limits: Meta free tier = 1000 conversations/month (a conversation = 24h window with a customer). More than enough.
- If whatsapp_phone_number_id is empty, the send endpoint returns 503. The webhook endpoint must always be available (it's a public endpoint).
- Media messages (images, docs): in this step, just log "Media message received" as body text. Full media download is a future enhancement.
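
The normalization described in the notes can live in one small shared helper (names are illustrative; the same helper serves the FreePBX call-matching step):

```python
def normalize_phone(raw):
    """Reduce a phone number to bare digits so Meta's format
    ("306974015758") matches stored contacts ("+30 697 401 5758")."""
    return "".join(ch for ch in raw if ch.isdigit())

def find_customer_by_phone(customers, raw_phone):
    """Return the first customer with a phone/whatsapp contact matching raw_phone."""
    target = normalize_phone(raw_phone)
    for cust in customers:
        for contact in cust.get("contacts", []):
            if contact["type"] in ("phone", "whatsapp") and \
                    normalize_phone(contact["value"]) == target:
                return cust
    return None  # caller logs with customer_id="unknown:{phone}"
```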

97
.claude/crm-step-12.md
Normal file
@@ -0,0 +1,97 @@

# CRM Step 12 — Integration: FreePBX AMI Call Logging

## Context

Read `.claude/crm-build-plan.md` for full context and IMPORTANT NOTES.

Steps 01–11 must be complete.

## Prerequisites (manual setup before this step)

- FreePBX server with AMI (Asterisk Manager Interface) enabled
- An AMI user created in FreePBX: Admin → Asterisk Manager Users
- Username + password (set these in config)
- Permissions needed: read = "call,cdr" (minimum)
- Network access from VPS to FreePBX AMI port (default: 5038)
- Values ready: `AMI_HOST`, `AMI_PORT` (5038), `AMI_USERNAME`, `AMI_PASSWORD`

## Task

Connect to FreePBX AMI over TCP, listen for call events, and auto-log them to crm_comms_log matched against customer phone numbers.

## Backend changes

### 1. Add to `backend/config.py`

```python
ami_host: str = ""
ami_port: int = 5038
ami_username: str = ""
ami_password: str = ""
```

### 2. Create `backend/crm/ami_listener.py`

AMI uses a plain TCP socket with a text protocol (`Key: Value\r\n` pairs, events separated by `\r\n\r\n`).

```python
import asyncio
from config import settings
from mqtt import database as mqtt_db

async def ami_connect_and_listen():
    """
    1. Open TCP connection to ami_host:ami_port
    2. Read the banner line
    3. Send login action:
           Action: Login\r\n
           Username: {ami_username}\r\n
           Secret: {ami_password}\r\n\r\n
    4. Read response — check for "Response: Success"
    5. Loop reading events. Parse each event block into a dict.
    6. Handle Event: Hangup:
       - CallerID: the phone number (field: CallerIDNum)
       - Duration: call duration seconds (field: Duration, may not always be present)
       - Channel direction: inbound if DestChannel starts with "PJSIP/" or "SIP/",
         outbound if Channel starts with "PJSIP/" or "SIP/"
       - Normalize CallerIDNum: strip leading + and spaces
       - Look up customer by normalized phone
       - Create crm_comms_log entry: type=call, direction=inbound|outbound,
         body=f"Call duration: {duration}s", ext_message_id=Uniqueid field
    7. On disconnect: wait 30s, reconnect. Infinite retry loop.
    """

async def start_ami_listener():
    """Entry point — only starts if ami_host is set."""
    if not settings.ami_host:
        return
    asyncio.create_task(ami_connect_and_listen())
```

### 3. Add to `backend/main.py` startup

```python
from crm.ami_listener import start_ami_listener

# in startup():
await start_ami_listener()
```

### 4. Add manual log endpoint to `backend/crm/router.py`

`POST /api/crm/calls/log`
Body: `{ customer_id, direction, duration_seconds, notes, occurred_at }`
Requires auth.
→ create crm_comms_log entry (type=call) manually
→ useful if auto-logging misses a call or for logging calls made outside the office

## Frontend changes

### Update Comms tab in `CustomerDetail.jsx`

- Call entries: amber/yellow color, phone icon
- Show duration if available (parse from body)
- "Log Call" button → quick modal with: direction (inbound/outbound), duration (minutes + seconds), notes, occurred_at
- On save: POST `/api/crm/calls/log`

### Update `InboxPage.jsx`

- Add "Call" to type filter options
- Call entries show customer name, direction arrow, duration

## Notes

- AMI protocol reference: each event/response is a block of `Key: Value` lines terminated by `\r\n\r\n`
- The `Hangup` event fires at end of call and includes Duration in seconds
- CallerIDNum for inbound calls is the caller's number. For outbound it's typically the extension — may need to use `DestCallerIDNum` instead. Test against your FreePBX setup.
- Phone matching uses the same normalization as WhatsApp step (strip `+`, spaces, leading zeros if needed)
- If AMI connection drops (FreePBX restart, network blip), the reconnect loop handles it silently
- This gives you: auto-logged inbound calls matched to customers, duration recorded, plus a manual log option for anything missed
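
The event parsing in step 5 amounts to splitting `Key: Value` lines on the first separator — a standalone sketch:

```python
def parse_ami_block(block):
    """Parse one AMI event/response block into a dict.

    A block is a series of "Key: Value" lines; on the wire, blocks are
    separated by a blank line (\\r\\n\\r\\n)."""
    event = {}
    for line in block.split("\r\n"):
        # Split on the first ": " only — values may themselves contain colons
        key, sep, value = line.partition(": ")
        if sep:
            event[key] = value
    return event
```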

16
.claude/settings.local.json
Normal file
@@ -0,0 +1,16 @@

{
  "permissions": {
    "allow": [
      "Bash(npm create:*)",
      "Bash(npm install:*)",
      "Bash(npm run build:*)",
      "Bash(python -c:*)",
      "Bash(npx vite build:*)",
      "Bash(wc:*)",
      "Bash(ls:*)",
      "Bash(node -c:*)",
      "Bash(npm run lint:*)",
      "Bash(python:*)"
    ]
  }
}
@@ -25,7 +25,7 @@ DEBUG=true
 NGINX_PORT=80

 # Local file storage (override if you want to store data elsewhere)
-SQLITE_DB_PATH=./data/database.db
+SQLITE_DB_PATH=./mqtt_data.db
 BUILT_MELODIES_STORAGE_PATH=./storage/built_melodies
 FIRMWARE_STORAGE_PATH=./storage/firmware

8
.gitignore
vendored
@@ -12,11 +12,6 @@ firebase-service-account.json
 !/data/.gitkeep
 !/data/built_melodies/.gitkeep

-# SQLite databases
-*.db
-*.db-shm
-*.db-wal
-
 # Python
 __pycache__/
 *.pyc
@@ -38,6 +33,3 @@ Thumbs.db
 .MAIN-APP-REFERENCE/

 .project-vesper-plan.md
-
-# claude
-.claude/

@@ -1,395 +0,0 @@

# BellSystems CP — Automation & Notification Engine Strategy

## Overview

This document defines the architecture and implementation plan for a three-tier intelligence layer built on top of the existing BellSystems Control Panel. The system consists of:

1. **Event Logs** — passive, timestamped record of notable system events
2. **Notifications** — real-time or near-real-time alerts surfaced in the UI
3. **Automation Rules** — trigger → condition → action pipelines, configurable via UI

The existing tech stack is unchanged: **FastAPI + SQLite (aiosqlite) + Firestore + React**. Everything new slots in as additional tables in `mqtt_data.db`, new backend modules, and new frontend pages/components.

---

## Architecture

```
┌──────────────────────────────────────────────────┐
│ Scheduler Loop (runs inside existing FastAPI     │
│ startup, alongside email_sync_loop)              │
│                                                  │
│ Every 60s: evaluate_rules()                      │
│          ↓                                       │
│ Rules Engine                                     │
│   → loads enabled rules from DB                  │
│   → evaluates conditions against live data       │
│   → fires Action Executor on match               │
│                                                  │
│ Action Executor                                  │
│   → create_event_log()                           │
│   → create_notification()                        │
│   → send_email() (existing)                      │
│   → mqtt_publish_command() (existing)            │
│   → update_field()                               │
└──────────────────────────────────────────────────┘
                 ↕ REST / WebSocket
┌──────────────────────────────────────────────────┐
│ React Frontend                                   │
│  - Bell icon in Header (unread count badge)      │
│  - Notifications dropdown/panel                  │
│  - /automations page (rule CRUD)                 │
│  - Event Log viewer (filterable)                 │
└──────────────────────────────────────────────────┘
```
|
||||
|
||||
---
|
||||
|
||||
## Database Schema (additions to `mqtt_data.db`)
|
||||
|
||||
### `event_log`
|
||||
Permanent, append-only record of things that happened.
|
||||
|
||||
```sql
|
||||
CREATE TABLE IF NOT EXISTS event_log (
|
||||
id TEXT PRIMARY KEY,
|
||||
category TEXT NOT NULL, -- 'device' | 'crm' | 'quotation' | 'user' | 'system'
|
||||
entity_type TEXT, -- 'device' | 'customer' | 'quotation' | 'user'
|
||||
entity_id TEXT, -- the ID of the affected record
|
||||
title TEXT NOT NULL,
|
||||
detail TEXT,
|
||||
severity TEXT NOT NULL DEFAULT 'info', -- 'info' | 'warning' | 'error'
|
||||
rule_id TEXT, -- which automation rule triggered this (nullable)
|
||||
created_at TEXT NOT NULL DEFAULT (datetime('now'))
|
||||
);
|
||||
CREATE INDEX IF NOT EXISTS idx_event_log_category ON event_log(category, created_at);
|
||||
CREATE INDEX IF NOT EXISTS idx_event_log_entity ON event_log(entity_type, entity_id);
|
||||
```
|
||||
|
||||
### `notifications`
|
||||
Short-lived, user-facing alerts. Cleared once read or after TTL.
|
||||
|
||||
```sql
|
||||
CREATE TABLE IF NOT EXISTS notifications (
|
||||
id TEXT PRIMARY KEY,
|
||||
title TEXT NOT NULL,
|
||||
body TEXT,
|
||||
link TEXT, -- optional frontend route, e.g. "/crm/customers/abc123"
|
||||
severity TEXT NOT NULL DEFAULT 'info', -- 'info' | 'warning' | 'error' | 'success'
|
||||
is_read INTEGER NOT NULL DEFAULT 0,
|
||||
rule_id TEXT,
|
||||
entity_type TEXT,
|
||||
entity_id TEXT,
|
||||
created_at TEXT NOT NULL DEFAULT (datetime('now'))
|
||||
);
|
||||
CREATE INDEX IF NOT EXISTS idx_notifications_read ON notifications(is_read, created_at);
|
||||
```

### `automation_rules`

Stores user-defined rules. Evaluated by the scheduler.

```sql
CREATE TABLE IF NOT EXISTS automation_rules (
  id TEXT PRIMARY KEY,
  name TEXT NOT NULL,
  description TEXT,
  enabled INTEGER NOT NULL DEFAULT 1,
  trigger_type TEXT NOT NULL,                 -- 'schedule' | 'mqtt_alert' | 'email_received'
  trigger_config TEXT NOT NULL DEFAULT '{}',  -- JSON
  conditions TEXT NOT NULL DEFAULT '[]',      -- JSON array of condition objects
  actions TEXT NOT NULL DEFAULT '[]',         -- JSON array of action objects
  cooldown_hours REAL NOT NULL DEFAULT 0,     -- min hours between firing on same entity
  last_run_at TEXT,
  run_count INTEGER NOT NULL DEFAULT 0,
  created_at TEXT NOT NULL DEFAULT (datetime('now')),
  updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);
```

### `automation_run_log`

Deduplication and audit trail for rule executions.

```sql
CREATE TABLE IF NOT EXISTS automation_run_log (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  rule_id TEXT NOT NULL,
  entity_type TEXT,
  entity_id TEXT,
  status TEXT NOT NULL,    -- 'fired' | 'skipped_cooldown' | 'error'
  detail TEXT,
  fired_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_run_log_rule ON automation_run_log(rule_id, fired_at);
CREATE INDEX IF NOT EXISTS idx_run_log_entity ON automation_run_log(entity_type, entity_id, fired_at);
```
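
The cooldown check that this table exists for could look like the following sketch. The query shape runs against the `automation_run_log` schema above; `fired_at` uses SQLite's `datetime('now')` text format, so lexicographic comparison matches chronological order.

```python
import sqlite3
from datetime import datetime, timedelta
from typing import Optional

def cooldown_ok(db: sqlite3.Connection, rule_id: str, entity_id: str,
                cooldown_hours: float, now: Optional[datetime] = None) -> bool:
    """True if the rule may fire on this entity: no successful run recorded
    inside the cooldown window for this (rule_id, entity_id) pair."""
    now = now or datetime.utcnow()
    cutoff = (now - timedelta(hours=cooldown_hours)).strftime("%Y-%m-%d %H:%M:%S")
    row = db.execute(
        "SELECT 1 FROM automation_run_log "
        "WHERE rule_id = ? AND entity_id = ? AND status = 'fired' AND fired_at > ? "
        "LIMIT 1",
        (rule_id, entity_id, cutoff),
    ).fetchone()
    return row is None
```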

---

## Backend Module: `automation/`

New module at `backend/automation/`, registered in `main.py`.

```
backend/automation/
├── __init__.py
├── router.py      # CRUD for rules, event_log GET, notifications GET/PATCH
├── models.py      # Pydantic schemas for rules, conditions, actions
├── engine.py      # evaluate_rules(), condition evaluators, action executors
├── scheduler.py   # automation_loop() async task, wired into main.py startup
└── database.py    # DB helpers for all 4 new tables
```

### Wiring into `main.py`

```python
from automation.router import router as automation_router
from automation.scheduler import automation_loop

app.include_router(automation_router)

# In startup():
asyncio.create_task(automation_loop())
```
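
A minimal sketch of what `automation_loop()` itself might look like. The injectable `evaluate` callable and the `max_iterations` escape hatch are assumptions added for testability; in the real app `evaluate` would be the engine's schedule-rule entry point, and the interval is a placeholder.

```python
import asyncio
import logging
from typing import Optional

logger = logging.getLogger("automation.scheduler")

async def automation_loop(evaluate, interval_seconds: float = 60.0,
                          max_iterations: Optional[int] = None) -> None:
    """Run the rules engine forever (or max_iterations times, for tests).

    One failing iteration must never kill the background task, so every
    exception is logged and swallowed.
    """
    iterations = 0
    while max_iterations is None or iterations < max_iterations:
        try:
            await evaluate()
        except Exception:
            logger.exception("automation loop iteration failed")
        iterations += 1
        await asyncio.sleep(interval_seconds)
```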

---

## Rule Object Structure (JSON, stored in DB)

```json
{
  "id": "rule_abc123",
  "name": "Quotation follow-up after 7 days",
  "enabled": true,
  "trigger_type": "schedule",
  "trigger_config": { "interval_hours": 24 },
  "conditions": [
    { "entity": "quotation", "field": "status", "op": "eq", "value": "sent" },
    { "entity": "quotation", "field": "days_since_updated", "op": "gte", "value": 7 },
    { "entity": "quotation", "field": "has_reply", "op": "eq", "value": false }
  ],
  "actions": [
    {
      "type": "send_email",
      "template_key": "quotation_followup",
      "to": "{{quotation.client_email}}",
      "subject": "Following up on Quotation {{quotation.quotation_number}}",
      "body": "Hi {{customer.name}}, did you have a chance to review our quotation?"
    },
    {
      "type": "create_notification",
      "title": "Follow-up sent to {{customer.name}}",
      "link": "/crm/customers/{{quotation.customer_id}}"
    },
    {
      "type": "create_event_log",
      "category": "quotation",
      "severity": "info",
      "title": "Auto follow-up sent for {{quotation.quotation_number}}"
    }
  ],
  "cooldown_hours": 168
}
```

---

## Supported Trigger Types

| Trigger | How it works |
|---|---|
| `schedule` | Evaluated every N hours by the background loop |
| `mqtt_alert` | Fires immediately when `_handle_alerts()` in `mqtt/logger.py` upserts an alert — hook into that function |
| `email_received` | Fires inside `sync_emails()` in `crm/email_sync.py` after a new inbound email is stored |

> **Note:** `mqtt_alert` and `email_received` triggers bypass the scheduler loop — they are called directly from the relevant handler functions, giving near-real-time response.

---

## Supported Condition Operators

| op | Meaning |
|---|---|
| `eq` | equals |
| `neq` | not equals |
| `gt` / `gte` / `lt` / `lte` | numeric comparisons |
| `contains` | string contains |
| `is_null` / `not_null` | field presence |
| `days_since` | computed: (now - field_datetime) in days |
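
A minimal evaluator for this operator table might look like the sketch below. It assumes conditions run against already-fetched entity dicts, and interprets `days_since` as "at least `value` days have passed since the field's datetime" (an assumption; the table only defines the computation).

```python
from datetime import datetime

def evaluate_condition(entity: dict, field: str, op: str, value) -> bool:
    """Evaluate one condition object against a plain dict of entity fields.
    `days_since` expects an ISO-8601 datetime string in the field."""
    current = entity.get(field)
    if op == "eq":       return current == value
    if op == "neq":      return current != value
    if op == "gt":       return current is not None and current > value
    if op == "gte":      return current is not None and current >= value
    if op == "lt":       return current is not None and current < value
    if op == "lte":      return current is not None and current <= value
    if op == "contains": return isinstance(current, str) and value in current
    if op == "is_null":  return current is None
    if op == "not_null": return current is not None
    if op == "days_since":
        if current is None:
            return False
        then = datetime.fromisoformat(current)
        days = (datetime.now(then.tzinfo) - then).days
        return days >= value
    raise ValueError(f"unknown operator: {op}")
```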

---

## Supported Action Types

| Action | What it does | Notes |
|---|---|---|
| `create_event_log` | Writes to `event_log` table | Always safe to fire |
| `create_notification` | Writes to `notifications` table | Surfaces in UI bell icon |
| `send_email` | Calls existing `crm.email_sync.send_email()` | Uses existing mail accounts |
| `update_field` | Updates a field on an entity in DB/Firestore | Use carefully — define allowed fields explicitly |
| `mqtt_publish` | Calls `mqtt_manager.publish_command()` | For device auto-actions |
| `webhook` | HTTP POST to an external URL | Future / optional |
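
The executor side of this table could be a simple dispatch map; the result-tuple shape and the decision to record unknown types rather than raise are assumptions in this sketch. In the real engine each handler would wrap the call named in the table (`send_email()`, `publish_command()`, DB inserts, ...).

```python
def execute_actions(actions: list, context: dict, executors: dict) -> list:
    """Dispatch each action object to its executor by action `type`.
    One bad action must not abort the rest of the rule's actions, so errors
    are captured per action instead of propagating."""
    results = []
    for action in actions:
        handler = executors.get(action.get("type"))
        if handler is None:
            results.append(("error", f"unknown action type: {action.get('type')}"))
            continue
        try:
            handler(action, context)
            results.append(("ok", action["type"]))
        except Exception as exc:
            results.append(("error", f"{action['type']}: {exc}"))
    return results
```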

---

## Notification System (Frontend)

### Bell Icon in Header

- Polling endpoint: `GET /api/notifications?unread=true&limit=20`
- Poll interval: 30 seconds (or switch to WebSocket push — the WS infrastructure already exists via `mqtt_manager`)
- Badge shows unread count
- Click opens a dropdown panel listing recent notifications with title, time, severity color, and optional link

### Notification Panel

- Mark as read: `PATCH /api/notifications/{id}/read`
- Mark all read: `PATCH /api/notifications/read-all`
- Link field navigates to the relevant page on click

### Toast Popups (optional, Phase 3 polish)

- Triggered when polling detects new unread notifications since the last check
- Use an existing toast component if one exists, otherwise add a lightweight one

---

## Automation Rules UI (`/automations`)

A new sidebar entry under Settings (sysadmin/admin only).

### Rule List Page

- Table: name, enabled toggle, trigger type, last run, run count, edit/delete
- "New Rule" button

### Rule Editor (modal or full page)

- **Name & description** — free text
- **Trigger** — dropdown: Schedule / MQTT Alert / Email Received
  - Schedule: interval hours input
  - MQTT Alert: subsystem filter (optional)
  - Email Received: from address filter (optional)
- **Conditions** — dynamic list, each row:
  - Entity selector (Quotation / Device / Customer / User)
  - Field selector (populated based on entity)
  - Operator dropdown
  - Value input
- **Actions** — dynamic list, each row:
  - Action type dropdown
  - Type-specific fields (to address, subject, body for email; notification title/body; etc.)
  - Template variables hint: `{{quotation.quotation_number}}`, `{{customer.name}}`, etc.
- **Cooldown** — hours between firings on the same entity
- **Enabled** toggle

### Rule Run History

- Per-rule log: when it fired, on which entity, success/error

---

## Event Log UI

Accessible from the `/event-log` route, linked from the Dashboard.

- Filterable by: category, severity, entity type, date range
- Columns: time, category, severity badge, title, entity link
- Append-only (no deletion from UI)
- Retention: purge entries older than a configurable number of days (e.g. 180) via the existing `purge_loop` pattern in `mqtt/database.py`

---

## Pre-Built Rules (Seeded on First Run, All Disabled)

These are created on first startup — the admin enables and customizes them.

| Rule | Trigger | Condition | Action |
|---|---|---|---|
| Quotation follow-up | Schedule 24h | status=sent AND days_since_updated ≥ 7 AND no reply | Send follow-up email + notify |
| Device offline warning | Schedule 1h | no heartbeat for > 2h | Create notification + event log |
| New unknown email | email_received | customer_id IS NULL | Create notification |
| Subscription expiring soon | Schedule 24h | subscription.expiry_date within 7 days | Notify + send email |
| Device critical alert | mqtt_alert | state = CRITICAL | Notify + event log + optional MQTT restart |
| Quotation expired | Schedule 24h | status=sent AND days_since_updated ≥ 30 | Update status → expired + notify |

---

## Implementation Phases

### Phase 1 — Foundation (DB + API)

- [ ] Add 4 new tables to `mqtt/database.py` schema + migrations
- [ ] Create `automation/database.py` with all DB helpers
- [ ] Create `automation/models.py` — Pydantic schemas for rules, conditions, actions, notifications, event_log
- [ ] Create `automation/router.py` — CRUD for rules, GET event_log, GET/PATCH notifications
- [ ] Wire router into `main.py`

### Phase 2 — Rules Engine + Scheduler

- [ ] Create `automation/engine.py` — condition evaluator, template renderer, action executor
- [ ] Create `automation/scheduler.py` — `automation_loop()` async task
- [ ] Hook `email_received` trigger into `crm/email_sync.sync_emails()`
- [ ] Hook `mqtt_alert` trigger into `mqtt/logger._handle_alerts()`
- [ ] Seed pre-built (disabled) rules on first startup
- [ ] Wire `automation_loop()` into `main.py` startup

### Phase 3 — Notification UI

- [ ] Bell icon with unread badge in `Header.jsx`
- [ ] Notifications dropdown panel component
- [ ] 30s polling hook in React
- [ ] Mark read / mark all read

### Phase 4 — Automation Rules UI

- [ ] `/automations` route and rule list page
- [ ] Rule editor form (conditions + actions dynamic builder)
- [ ] Enable/disable toggle
- [ ] Run history per rule
- [ ] Add "Automations" entry to Sidebar under Settings

### Phase 5 — Event Log UI

- [ ] `/event-log` route with filterable table
- [ ] Purge policy wired into existing `purge_loop`
- [ ] Dashboard widget showing recent high-severity events

### Phase 6 — Polish

- [ ] Toast notifications on new unread detection
- [ ] Template variable previewer in rule editor
- [ ] "Run now" button per rule (for testing without waiting for the scheduler)
- [ ] Named email templates stored in DB (reusable across rules)

---

## Key Design Decisions

| Decision | Choice | Reason |
|---|---|---|
| Storage | SQLite (same `mqtt_data.db`) | Consistent with existing pattern; no new infra |
| Scheduler | `asyncio` task in FastAPI startup | Same pattern as `email_sync_loop` and `purge_loop` already in `main.py` |
| Rule format | JSON columns in DB | Flexible, UI-editable, no schema migrations per new rule type |
| Template variables | `{{entity.field}}` string interpolation | Simple to implement, readable in UI |
| Cooldown dedup | `automation_run_log` per (rule_id, entity_id) | Prevents repeat firing on the same quotation/device within the cooldown window |
| Notification delivery | DB polling (30s) initially | The WS infra exists (`mqtt_manager._ws_subscribers`) — easy to upgrade later |
| Pre-built rules | Seeded as disabled | Non-intrusive — the admin must consciously enable each one |
| `update_field` safety | Explicit allowlist of permitted fields | Prevents accidental data corruption from misconfigured rules |

---

## Template Variables Reference

Available inside action `body`, `subject`, `title`, and `link` fields:

| Variable | Source |
|---|---|
| `{{customer.name}}` | Firestore `crm_customers` |
| `{{customer.organization}}` | Firestore `crm_customers` |
| `{{quotation.quotation_number}}` | SQLite `crm_quotations` |
| `{{quotation.final_total}}` | SQLite `crm_quotations` |
| `{{quotation.status}}` | SQLite `crm_quotations` |
| `{{quotation.client_email}}` | SQLite `crm_quotations` |
| `{{device.serial}}` | Firestore `devices` |
| `{{device.label}}` | Firestore `devices` |
| `{{alert.subsystem}}` | MQTT alert payload |
| `{{alert.state}}` | MQTT alert payload |
| `{{user.email}}` | Firestore `users` |
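
The `{{entity.field}}` interpolation can be one regex substitution over a context of plain dicts keyed by entity name. Leaving unknown variables untouched (so a typo in a rule stays visible in the output instead of crashing the action) is an assumption of this sketch.

```python
import re

_VAR_RE = re.compile(r"\{\{\s*(\w+)\.(\w+)\s*\}\}")

def render_template(text: str, context: dict) -> str:
    """Substitute {{entity.field}} markers from a context like
    {"customer": {...}, "quotation": {...}}."""
    def replace(match):
        entity, field = match.group(1), match.group(2)
        value = context.get(entity, {}).get(field)
        return str(value) if value is not None else match.group(0)
    return _VAR_RE.sub(replace, text)
```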

---

## Notes

- `crm/email_sync.send_email()` is reused as-is for the `send_email` action type. The engine constructs the call parameters.
- `update_field` actions start with an allowlist of `quotation.status` and `user.status`. Expand it deliberately.
- For MQTT auto-restart, `mqtt_manager.publish_command(serial, "restart", {})` already works — the engine just calls it.
- Firestore is read-only from the automation engine (for customer/device lookups). All writes go to SQLite, consistent with the existing architecture.
- The `has_reply` condition on quotations is computed by checking whether any `crm_comms_log` entry exists with `direction='inbound'` and a `customer_id` matching the quotation's customer, dated after the quotation's `updated_at`.
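
That `has_reply` check translates to a single EXISTS query; the `created_at` column name on `crm_comms_log` is an assumption in this sketch.

```python
import sqlite3

HAS_REPLY_SQL = """
SELECT EXISTS (
    SELECT 1 FROM crm_comms_log
    WHERE direction = 'inbound'
      AND customer_id = :customer_id
      AND created_at > :quotation_updated_at
)
"""

def has_reply(db: sqlite3.Connection, customer_id: str, quotation_updated_at: str) -> bool:
    """True if any inbound comms entry for the customer postdates the quotation."""
    row = db.execute(HAS_REPLY_SQL, {
        "customer_id": customer_id,
        "quotation_updated_at": quotation_updated_at,
    }).fetchone()
    return bool(row[0])
```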
@@ -1,404 +0,0 @@
# CRM Customer Status System — Implementation Plan

## Context

This project is a React + FastAPI + Firestore admin console located at `C:\development\bellsystems-cp`.

The frontend lives in `frontend/src/` and the backend in `backend/`.
The CRM module is at `frontend/src/crm/` and `backend/crm/`.

Currently, customers have two flat boolean flags on their Firestore document:

- `negotiating: bool`
- `has_problem: bool`

These need to be replaced with a richer, structured system as described below.

---

## 1. Target Data Model

### 1A. On the Customer Document (`customers/{id}`)

Remove `negotiating` and `has_problem`. Add the following:

```
relationship_status: string
    — one of: "lead" | "prospect" | "active" | "inactive" | "churned"
    — default: "lead"

technical_issues: array of {
    active: bool,
    opened_date: Firestore Timestamp,
    resolved_date: Firestore Timestamp | null,
    note: string,
    opened_by: string,           ← display name or user ID of staff member
    resolved_by: string | null
}

install_support: array of {
    active: bool,
    opened_date: Firestore Timestamp,
    resolved_date: Firestore Timestamp | null,
    note: string,
    opened_by: string,
    resolved_by: string | null
}

transaction_history: array of {
    date: Firestore Timestamp,
    flow: string,                ← "invoice" | "payment" | "refund" | "credit"
    payment_type: string | null, ← "cash" | "bank_transfer" | "card" | "paypal" — null for invoices
    category: string,            ← "full_payment" | "advance" | "installment"
    amount: number,
    currency: string,            ← default "EUR"
    invoice_ref: string | null,
    order_ref: string | null,    ← references an order document ID, nullable
    recorded_by: string,
    note: string
}
```

### 1B. Orders Subcollection (`customers/{id}/orders/{order_id}`)

Orders live **exclusively** as a subcollection under each customer. There is no top-level `orders`
collection. The existing top-level `orders` collection in Firestore and its corresponding backend
routes should be **removed entirely** and replaced with subcollection-based routes under
`/crm/customers/{customer_id}/orders/`.

If cross-customer order querying is ever needed in the future, use Firestore's native
`collectionGroup("orders")` query — no top-level mirror collection is required.

Each order document carries the following fields:

```
order_number: string             ← e.g. "ORD-2026-041" (already exists — keep)
title: string                    ← NEW: human-readable name e.g. "3x Wall Mount Units - Athens Office"
created_by: string               ← NEW: staff user ID or display name

status: string                   ← REPLACE existing OrderStatus enum with new values:
    — "negotiating" | "awaiting_quotation" | "awaiting_customer_confirmation"
      | "awaiting_fulfilment" | "awaiting_payment" | "manufacturing"
      | "shipped" | "installed" | "declined" | "complete"

status_updated_date: Firestore Timestamp   ← NEW
status_updated_by: string                  ← NEW

payment_status: object {         ← NEW — replaces the flat PaymentStatus enum
    required_amount: number,
    received_amount: number,     ← computed from transaction_history where order_ref matches
    balance_due: number,         ← computed: required_amount - received_amount
    advance_required: bool,
    advance_amount: number | null,
    payment_complete: bool
}

timeline: array of {             ← NEW — order event log
    date: Firestore Timestamp,
    type: string,                ← "quote_request" | "quote_sent" | "quote_accepted" | "quote_declined"
                                   | "mfg_started" | "mfg_complete" | "order_shipped" | "installed"
                                   | "payment_received" | "invoice_sent" | "note"
    note: string,
    updated_by: string
}
```
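
The two computed fields can be derived with a small pure function. Treating refunds as negative payments is an assumption in this sketch, since the spec only says received_amount is "computed from transaction_history where order_ref matches".

```python
def compute_payment_status(required_amount: float, advance_required: bool,
                           advance_amount, order_ref: str,
                           transaction_history: list) -> dict:
    """Derive the payment_status object for one order from the customer's
    transaction_history entries that reference it."""
    received = 0.0
    for tx in transaction_history:
        if tx.get("order_ref") != order_ref:
            continue
        if tx["flow"] == "payment":
            received += tx["amount"]
        elif tx["flow"] == "refund":
            received -= tx["amount"]
    return {
        "required_amount": required_amount,
        "received_amount": received,
        "balance_due": required_amount - received,
        "advance_required": advance_required,
        "advance_amount": advance_amount,
        "payment_complete": received >= required_amount,
    }
```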

---

## 2. Backend Changes

### 2A. `backend/crm/models.py`

- **Remove** `negotiating: bool` and `has_problem: bool` from `CustomerCreate` and `CustomerUpdate`.
- **Add** `relationship_status: Optional[str] = "lead"` to `CustomerCreate` and `CustomerUpdate`.
- **Add** `technical_issues: List[dict] = []` to `CustomerCreate` and `CustomerUpdate`.
- **Add** `install_support: List[dict] = []` to `CustomerCreate` and `CustomerUpdate`.
- **Add** `transaction_history: List[dict] = []` to `CustomerCreate` and `CustomerUpdate`.
- **Add** proper Pydantic models for each of the above array item shapes:
  - `TechnicalIssue` model
  - `InstallSupportEntry` model
  - `TransactionEntry` model
- **Update** `OrderStatus` enum with the new values:
  `negotiating`, `awaiting_quotation`, `awaiting_customer_confirmation`,
  `awaiting_fulfilment`, `awaiting_payment`, `manufacturing`,
  `shipped`, `installed`, `declined`, `complete`
- **Replace** the flat `PaymentStatus` enum on `OrderCreate` / `OrderUpdate` with a new `OrderPaymentStatus` Pydantic model matching the structure above.
- **Add** `title: Optional[str]`, `created_by: Optional[str]`, `status_updated_date: Optional[str]`,
  `status_updated_by: Optional[str]`, and `timeline: List[dict] = []` to `OrderCreate` and `OrderUpdate`.

### 2B. `backend/crm/customers_router.py`

- Update any route that reads/writes `negotiating` or `has_problem` to use the new fields.
- Add new dedicated endpoints:

```
POST /crm/customers/{id}/technical-issues
    — body: { note: str, opened_by: str }
    — appends a new active issue to the array

PATCH /crm/customers/{id}/technical-issues/{index}/resolve
    — body: { resolved_by: str }
    — sets active=false and resolved_date=now on the item at that index

POST /crm/customers/{id}/install-support
    — same pattern as technical-issues above

PATCH /crm/customers/{id}/install-support/{index}/resolve
    — same as technical-issues resolve

POST /crm/customers/{id}/transactions
    — body: TransactionEntry (see model above)
    — appends to transaction_history

PATCH /crm/customers/{id}/relationship-status
    — body: { status: str }
    — updates relationship_status field
```

### 2C. `backend/crm/orders_router.py`

- **Remove** all top-level `/crm/orders/` routes entirely.
- Re-implement all order CRUD under `/crm/customers/{customer_id}/orders/`:

```
GET    /crm/customers/{customer_id}/orders/
POST   /crm/customers/{customer_id}/orders/
GET    /crm/customers/{customer_id}/orders/{order_id}
PATCH  /crm/customers/{customer_id}/orders/{order_id}
DELETE /crm/customers/{customer_id}/orders/{order_id}
```

- Add an endpoint to append a timeline event:

```
POST /crm/customers/{customer_id}/orders/{order_id}/timeline
    — body: { type: str, note: str, updated_by: str }
    — appends to the timeline array and updates status_updated_date + status_updated_by
```

- Add an endpoint to update payment status:

```
PATCH /crm/customers/{customer_id}/orders/{order_id}/payment-status
    — body: OrderPaymentStatus fields (partial update allowed)
```

- Add a dedicated "Init Negotiations" endpoint:

```
POST /crm/customers/{customer_id}/orders/init-negotiations
    — body: { title: str, note: str, date: datetime, created_by: str }
    — creates a new order with status="negotiating", auto-fills all other fields
    — simultaneously updates the customer's relationship_status to "active"
      (only if currently "lead" or "prospect" — do not downgrade an already "active" customer)
    — returns the newly created order document
```
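
The no-downgrade rule for that endpoint is worth pinning down as a tiny helper, so the same logic can be reused by the migration script:

```python
def upgraded_relationship_status(current: str) -> str:
    """Init-negotiations side effect: promote lead/prospect to active,
    and never downgrade any other status."""
    return "active" if current in ("lead", "prospect") else current
```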

---

## 3. Frontend Changes

### 3A. `frontend/src/crm/customers/CustomerList.jsx`

- When the Notes: Quick filter is set, replace the `negotiating` and `has_problem` boolean badge display in the Status column with:
  - A **relationship status chip** (color-coded pill: lead=grey, prospect=blue, active=green, inactive=amber, churned=soft red)
  - A small **red dot / warning icon** if `technical_issues.some(i => i.active)` is true, under a new "Support" column. Add this column to the list of arrangeable and toggleable columns.
  - A small **amber dot / support icon** if `install_support.some(i => i.active)` is true, under the same "Support" column.
  - These are derived from the arrays — do not store a separate boolean on the document.
- When the Notes: Expanded filter is set, replace the `negotiating` and `has_problem` verbose displays with the active order status (if any) in this format:
  `"<Status Label> — <Date> — <Note>"`, e.g. `"Negotiating — 24.03.26 — Customer requested a more affordable quotation"`

### 3B. `frontend/src/crm/customers/CustomerDetail.jsx`

The customer detail page currently has a tab structure: Overview, Orders, Quotations, Communication, Files & Media, Devices.

Make the following changes:

#### Whole page

- At the top of the page, where the name, organization, and full address are displayed, change the layout to:
  Line 1: `Full Title + Name + Surname`
  Line 2: `Organization · City` (city only, not full address)
- Remove the horizontal separator line after the title and before the tabs.
- On the top right side there is an Edit Customer button. To its left, add **3 new buttons** in this
  order (left → right): **Init Negotiations**, **Record Issue/Support**, **Record Payment**, then
  the existing Edit button. All 4 buttons are the same size. Add solid single-color icons to each.

**"Init Negotiations" button** (blue/indigo accent):

- Opens a mini modal.
- Fields: Date (defaults to NOW), Title (text input, required), Note (textarea, optional).
- Auto-filled server-side: `status = "negotiating"`, `created_by` = current user,
  `status_updated_date` = now, `status_updated_by` = current user,
  `payment_status` defaults to a zeroed object.
- On confirm: calls `POST /crm/customers/{id}/orders/init-negotiations`.
- After success: refreshes customer data and the orders list. The customer's `relationship_status`
  is set to `"active"` server-side — no separate frontend call needed.
- This is a fast-entry shortcut only. All subsequent edits to this order happen via the Orders tab.

**"Record Issue/Support" button** (amber/orange accent):

- Opens a mini modal.
- At the top: a **2-button toggle selector** (not a dropdown) to choose: `Technical Issue` | `Install Support`.
- Fields: Date (defaults to NOW), Note (textarea, required).
- On confirm: calls `POST /crm/customers/{id}/technical-issues` or
  `POST /crm/customers/{id}/install-support` depending on the selection.

**"Record Payment" button** (green accent):

- Opens a mini modal.
- Fields: Date (defaults to NOW), Payment Type (cash | bank transfer | card | paypal),
  Category (full payment | advance | installment), Amount (number), Currency (defaults to EUR),
  Invoice Ref (searchable over the customer's invoices, optional),
  Order Ref (searchable/selectable from the customer's orders, optional),
  Note (textarea, optional).
- On confirm: calls `POST /crm/customers/{id}/transactions`.

#### Overview Tab

- The main hero section gets a complete overhaul — start fresh:
  - **Row 1 — Relationship Status selector**: The 5 statuses (`lead | prospect | active | inactive | churned`) as styled pill/tab buttons in a row. The current status is highlighted with a glow effect. Color-code using global CSS variables (add them to `index.css` if not already present). Clicking a status immediately calls `PATCH /crm/customers/{id}/relationship-status`.
  - **Row 2 — Customer info**: All fields except Name and Organization (shown in the page header). Include language, religion, tags, etc.
  - **Row 3 — Contacts**: All contact entries (phone, email, WhatsApp, etc.).
  - **Row 4 — Notes**: Responsive column grid. 1 column below 1100px, 2 columns 1100–2000px, 3 columns above 2000px. Masonry/wrap layout with no gaps between note cards.
- Move the Latest Orders section to just below the hero section, before Latest Communications.
  Hide this section entirely if no orders exist for this customer.
- For all other sections (Latest Communications, Latest Quotations, Devices): hide each section
  entirely if it has no data. Show it dynamically when data exists.

#### New "Support" Tab (add to TABS array, after Overview)

Two full-width section cards:

**Technical Issues Card**

- Header shows an active count badge (e.g. "2 active")
- All issues listed, newest first (active and resolved)
- Each row: colored status dot, opened date, note, opened_by — "Resolve" button if active
- If more than 5 items: the list is scrollable (fixed max-height) and does not expand the page
- "Report New Issue" button → small inline form with note field + submit

**Install Support Card**

- Identical structure to the Technical Issues card
- Same scrollable behavior if more than 5 items

#### New "Financials" Tab (add to TABS array, after Support)

Two sections:

**Active Order Payment Status** (shown only if an active order exists)

- required_amount, received_amount, balance_due
- Advance required indicator + advance amount if applicable
- Payment complete indicator

**Transaction History**

- Ledger table: Date | Flow | Amount | Currency | Method | Category | Order Ref | Invoice Ref | Note | Recorded By | Actions
- "Add Transaction" button → modal with all TransactionEntry fields
- Totals row: Total Invoiced vs Total Paid vs Outstanding Balance
- Each row: right-aligned **Actions** button (consistent with other tables in the project)
  with options: **Edit** (opens the edit form) and **Delete** (requires a confirmation dialog)

#### Orders Tab (existing — update in place)

- Each order card/row shows:
  - `title` as the primary heading
  - `status` with a human-readable label and color coding (see Section 4)
  - `payment_status` summary: required / received / balance due
- **"View Timeline"** toggle: expands a vertical event log below the order card
- **"Add Timeline Event"** button: small inline form with type dropdown + note field
- Update all API calls to use the `/crm/customers/{customer_id}/orders/` routes.

### 3C. `frontend/src/crm/customers/CustomerForm.jsx`

- Remove the `negotiating` and `has_problem` fields.
- Add a `relationship_status` dropdown (default: `"lead"`).
- No issue/transaction forms are needed here — these are managed from the detail page.

### 3D. `frontend/src/crm/orders/OrderForm.jsx` and `OrderDetail.jsx`

- Update the status dropdown with the new values and labels:
  - `negotiating` → "Negotiating"
  - `awaiting_quotation` → "Awaiting Quotation"
  - `awaiting_customer_confirmation` → "Awaiting Customer Confirmation"
  - `awaiting_fulfilment` → "Awaiting Fulfilment"
  - `awaiting_payment` → "Awaiting Payment"
  - `manufacturing` → "Manufacturing"
  - `shipped` → "Shipped"
  - `installed` → "Installed"
  - `declined` → "Declined"
  - `complete` → "Complete"
- Add a `title` input field (required).
- Replace the flat `payment_status` enum with the new `payment_status` object fields.
- Add a Timeline section to `OrderDetail.jsx`: vertical event log + add-entry inline form.
- Update all API calls to use the `/crm/customers/{customer_id}/orders/` routes.

---

## 4. Status Color Coding Reference

Define all of these as CSS variables in `index.css` and use them consistently across all views:

### Relationship Status

| Status | Color |
|---|---|
| lead | grey / muted |
| prospect | blue |
| active | green |
| inactive | amber |
| churned | dark or soft red |

### Order Status

| Status | Color |
|---|---|
| negotiating | blue |
| awaiting_quotation | purple |
| awaiting_customer_confirmation | indigo |
| awaiting_fulfilment | amber |
| awaiting_payment | orange |
| manufacturing | cyan |
| shipped | teal |
| installed | green |
| declined | red |
| complete | muted/grey |

### Issue / Support Flags

| State | Color |
|---|---|
| active issue | red |
| active support | amber |
| resolved | muted/grey |

---

## 5. Migration Notes

- The old `negotiating` and `has_problem` fields will remain in Firestore until the migration script is run. The backend should **read both old and new fields** during the transition period, preferring the new structure if present.
- A one-time migration script (`backend/migrate_customer_flags.py`) should:
  1. Read all customer documents
  2. If `negotiating: true` → create an order in the customer's `orders` subcollection with `status = "negotiating"` and set `relationship_status = "active"` on the customer
  3. If `has_problem: true` → append one entry to `technical_issues` with `active: true`, `opened_date: customer.updated_at`, `note: "Migrated from legacy has_problem flag"`, `opened_by: "system"`
  4. Remove `negotiating` and `has_problem` from the customer document
- Do **not** run the migration script until all frontend and backend changes are deployed and tested.
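
Steps 2–4 can be factored as a pure transform that the script applies per document, keeping the Firestore I/O at the edges. This is a sketch: the field names follow the data model above, while the order-creation and the actual delete of the legacy flags (e.g. via `firestore.DELETE_FIELD` in the update call) are left to the surrounding script.

```python
def migrate_customer_flags(customer: dict) -> tuple:
    """Compute migration output for one customer document.

    Returns (field updates to write, whether a 'negotiating' order must be
    created in the customer's orders subcollection)."""
    updates = {}
    needs_order = bool(customer.get("negotiating"))
    if needs_order:
        updates["relationship_status"] = "active"
    if customer.get("has_problem"):
        issues = list(customer.get("technical_issues", []))
        issues.append({
            "active": True,
            "opened_date": customer.get("updated_at"),
            "resolved_date": None,
            "note": "Migrated from legacy has_problem flag",
            "opened_by": "system",
            "resolved_by": None,
        })
        updates["technical_issues"] = issues
    return updates, needs_order
```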

---

## 6. File Summary — What to Touch

```
backend/crm/models.py                          ← model updates (primary changes)
backend/crm/customers_router.py                ← new endpoints + field updates
backend/crm/orders_router.py                   ← remove top-level routes, re-implement as subcollection,
                                                 add timeline + payment-status + init-negotiations endpoints
backend/migrate_customer_flags.py              ← NEW one-time migration script

frontend/src/index.css                         ← add CSS variables for all new status colors
frontend/src/crm/customers/CustomerList.jsx    ← relationship status chip + support flag dots column
frontend/src/crm/customers/CustomerDetail.jsx  ← page header, 3 new quick-entry buttons + modals,
                                                 Overview tab overhaul, new Support tab,
                                                 new Financials tab, Orders tab updates
frontend/src/crm/customers/CustomerForm.jsx    ← remove old flags, add relationship_status
frontend/src/crm/orders/OrderForm.jsx          ← new status values, title field, payment_status,
                                                 updated API route paths
frontend/src/crm/orders/OrderDetail.jsx        ← timeline section, updated status/payment,
                                                 updated API route paths
```

---

## 7. Do NOT Change (out of scope)

- Quotations system — leave as-is
- Communications / inbox — leave as-is
- Files & Media tab — leave as-is
- Devices tab — leave as-is
- Any other module outside `crm/`
@@ -1,6 +1,6 @@
 FROM python:3.11-slim
 
-# System dependencies: WeasyPrint (pango/cairo), ffmpeg (video thumbs), poppler (pdf2image)
+# WeasyPrint system dependencies (libpango, libcairo, etc.)
 RUN apt-get update && apt-get install -y --no-install-recommends \
     libpango-1.0-0 \
     libpangocairo-1.0-0 \
@@ -8,8 +8,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
     libffi-dev \
     shared-mime-info \
     fonts-dejavu-core \
-    ffmpeg \
-    poppler-utils \
     && apt-get clean && rm -rf /var/lib/apt/lists/*
 
 WORKDIR /app

@@ -1,38 +1,27 @@
 import json
 import logging
-from database import get_db
+from mqtt.database import get_db
 
 logger = logging.getLogger("builder.database")
 
 
-async def insert_built_melody(melody_id: str, name: str, pid: str, steps: str, is_builtin: bool = False) -> None:
+async def insert_built_melody(melody_id: str, name: str, pid: str, steps: str) -> None:
     db = await get_db()
     await db.execute(
-        """INSERT INTO built_melodies (id, name, pid, steps, assigned_melody_ids, is_builtin)
-           VALUES (?, ?, ?, ?, ?, ?)""",
-        (melody_id, name, pid, steps, json.dumps([]), 1 if is_builtin else 0),
+        """INSERT INTO built_melodies (id, name, pid, steps, assigned_melody_ids)
+           VALUES (?, ?, ?, ?, ?)""",
+        (melody_id, name, pid, steps, json.dumps([])),
     )
     await db.commit()
 
 
-async def update_built_melody(melody_id: str, name: str, pid: str, steps: str, is_builtin: bool = False) -> None:
+async def update_built_melody(melody_id: str, name: str, pid: str, steps: str) -> None:
     db = await get_db()
     await db.execute(
         """UPDATE built_melodies
-           SET name = ?, pid = ?, steps = ?, is_builtin = ?, updated_at = datetime('now')
+           SET name = ?, pid = ?, steps = ?, updated_at = datetime('now')
            WHERE id = ?""",
-        (name, pid, steps, 1 if is_builtin else 0, melody_id),
+        (name, pid, steps, melody_id),
     )
     await db.commit()
-
-
-async def update_builtin_flag(melody_id: str, is_builtin: bool) -> None:
-    db = await get_db()
-    await db.execute(
-        """UPDATE built_melodies
-           SET is_builtin = ?, updated_at = datetime('now')
-           WHERE id = ?""",
-        (1 if is_builtin else 0, melody_id),
-    )
-    await db.commit()
@@ -79,7 +68,6 @@ async def get_built_melody(melody_id: str) -> dict | None:
         return None
     row = dict(rows[0])
     row["assigned_melody_ids"] = json.loads(row["assigned_melody_ids"] or "[]")
-    row["is_builtin"] = bool(row.get("is_builtin", 0))
     return row
 
 
@@ -92,7 +80,6 @@ async def list_built_melodies() -> list[dict]:
     for row in rows:
         r = dict(row)
         r["assigned_melody_ids"] = json.loads(r["assigned_melody_ids"] or "[]")
-        r["is_builtin"] = bool(r.get("is_builtin", 0))
         results.append(r)
     return results

@@ -6,14 +6,12 @@ class BuiltMelodyCreate(BaseModel):
     name: str
     pid: str
     steps: str  # raw step string e.g. "1,2,2+1,1,2,3+1"
-    is_builtin: bool = False
 
 
 class BuiltMelodyUpdate(BaseModel):
     name: Optional[str] = None
     pid: Optional[str] = None
     steps: Optional[str] = None
-    is_builtin: Optional[bool] = None
 
 
 class BuiltMelodyInDB(BaseModel):
@@ -21,7 +19,6 @@ class BuiltMelodyInDB(BaseModel):
     name: str
     pid: str
     steps: str
-    is_builtin: bool = False
     binary_path: Optional[str] = None
     binary_url: Optional[str] = None
     progmem_code: Optional[str] = None

@@ -1,5 +1,5 @@
 from fastapi import APIRouter, Depends, HTTPException
-from fastapi.responses import FileResponse, PlainTextResponse
+from fastapi.responses import FileResponse
 from auth.models import TokenPayload
 from auth.dependencies import require_permission
 from builder.models import (
@@ -20,7 +20,6 @@ async def list_built_melodies(
     melodies = await service.list_built_melodies()
     return BuiltMelodyListResponse(melodies=melodies, total=len(melodies))
 
-
 @router.get("/for-melody/{firestore_melody_id}")
 async def get_for_firestore_melody(
     firestore_melody_id: str,
@@ -33,14 +32,6 @@ async def get_for_firestore_melody(
     return result.model_dump()
 
 
-@router.get("/generate-builtin-list")
-async def generate_builtin_list(
-    _user: TokenPayload = Depends(require_permission("melodies", "view")),
-):
-    """Generate a C++ header with PROGMEM arrays for all is_builtin archetypes."""
-    code = await service.generate_builtin_list()
-    return PlainTextResponse(content=code, media_type="text/plain")
-
-
 @router.get("/{melody_id}", response_model=BuiltMelodyInDB)
 async def get_built_melody(
@@ -75,15 +66,6 @@ async def delete_built_melody(
     await service.delete_built_melody(melody_id)
 
 
-@router.post("/{melody_id}/toggle-builtin", response_model=BuiltMelodyInDB)
-async def toggle_builtin(
-    melody_id: str,
-    _user: TokenPayload = Depends(require_permission("melodies", "edit")),
-):
-    """Toggle the is_builtin flag for an archetype."""
-    return await service.toggle_builtin(melody_id)
-
-
 @router.post("/{melody_id}/build-binary", response_model=BuiltMelodyInDB)
 async def build_binary(
     melody_id: str,

@@ -32,7 +32,6 @@ def _row_to_built_melody(row: dict) -> BuiltMelodyInDB:
         name=row["name"],
         pid=row["pid"],
         steps=row["steps"],
-        is_builtin=row.get("is_builtin", False),
         binary_path=binary_path,
         binary_url=binary_url,
         progmem_code=row.get("progmem_code"),
@@ -152,12 +151,8 @@ async def create_built_melody(data: BuiltMelodyCreate) -> BuiltMelodyInDB:
         name=data.name,
         pid=data.pid,
         steps=data.steps,
-        is_builtin=data.is_builtin,
     )
-    # Auto-build binary and builtin code on creation
-    result = await get_built_melody(melody_id)
-    result = await _do_build(melody_id)
-    return result
+    return await get_built_melody(melody_id)
 
 
 async def update_built_melody(melody_id: str, data: BuiltMelodyUpdate) -> BuiltMelodyInDB:
@@ -168,22 +163,11 @@ async def update_built_melody(melody_id: str, data: BuiltMelodyUpdate) -> BuiltM
     new_name = data.name if data.name is not None else row["name"]
     new_pid = data.pid if data.pid is not None else row["pid"]
     new_steps = data.steps if data.steps is not None else row["steps"]
-    new_is_builtin = data.is_builtin if data.is_builtin is not None else row.get("is_builtin", False)
 
     await _check_unique(new_name, new_pid or "", exclude_id=melody_id)
 
-    steps_changed = (data.steps is not None) and (data.steps != row["steps"])
-
-    await db.update_built_melody(melody_id, name=new_name, pid=new_pid, steps=new_steps, is_builtin=new_is_builtin)
-
-    # If steps changed, flag all assigned melodies as outdated, then rebuild
-    if steps_changed:
-        assigned_ids = row.get("assigned_melody_ids", [])
-        if assigned_ids:
-            await _flag_melodies_outdated(assigned_ids, True)
-
-    # Auto-rebuild binary and builtin code on every save
-    return await _do_build(melody_id)
+    await db.update_built_melody(melody_id, name=new_name, pid=new_pid, steps=new_steps)
+    return await get_built_melody(melody_id)
 
 
 async def delete_built_melody(melody_id: str) -> None:
@@ -191,11 +175,6 @@ async def delete_built_melody(melody_id: str) -> None:
     if not row:
         raise HTTPException(status_code=404, detail=f"Built melody '{melody_id}' not found")
 
-    # Flag all assigned melodies as outdated before deleting
-    assigned_ids = row.get("assigned_melody_ids", [])
-    if assigned_ids:
-        await _flag_melodies_outdated(assigned_ids, True)
-
     # Delete the .bsm file if it exists
     if row.get("binary_path"):
         bsm_path = Path(row["binary_path"])
@@ -205,26 +184,10 @@ async def delete_built_melody(melody_id: str) -> None:
     await db.delete_built_melody(melody_id)
 
 
-async def toggle_builtin(melody_id: str) -> BuiltMelodyInDB:
-    """Toggle the is_builtin flag for an archetype."""
-    row = await db.get_built_melody(melody_id)
-    if not row:
-        raise HTTPException(status_code=404, detail=f"Built melody '{melody_id}' not found")
-    new_value = not row.get("is_builtin", False)
-    await db.update_builtin_flag(melody_id, new_value)
-    return await get_built_melody(melody_id)
-
-
 # ============================================================================
 # Build Actions
 # ============================================================================
 
-async def _do_build(melody_id: str) -> BuiltMelodyInDB:
-    """Internal: build both binary and PROGMEM code, return updated record."""
-    await build_binary(melody_id)
-    return await build_builtin_code(melody_id)
-
-
 async def build_binary(melody_id: str) -> BuiltMelodyInDB:
     """Parse steps and write a .bsm binary file to storage."""
     row = await db.get_built_melody(melody_id)
@@ -273,48 +236,6 @@ async def get_binary_path(melody_id: str) -> Optional[Path]:
     return path
 
 
-async def generate_builtin_list() -> str:
-    """Generate a C++ header with PROGMEM arrays for all is_builtin archetypes."""
-    rows = await db.list_built_melodies()
-    builtin_rows = [r for r in rows if r.get("is_builtin")]
-
-    if not builtin_rows:
-        return "// No built-in archetypes defined.\n"
-
-    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
-    parts = [
-        f"// Auto-generated Built-in Archetype List",
-        f"// Generated: {timestamp}",
-        f"// Total built-ins: {len(builtin_rows)}",
-        "",
-        "#pragma once",
-        "#include <avr/pgmspace.h>",
-        "",
-    ]
-
-    entry_refs = []
-    for row in builtin_rows:
-        values = steps_string_to_values(row["steps"])
-        array_name = f"melody_builtin_{row['name'].lower().replace(' ', '_')}"
-        display_name = row["name"].replace("_", " ").title()
-        pid = row.get("pid") or f"builtin_{row['name'].lower()}"
-
-        parts.append(f"// {display_name} | PID: {pid} | Steps: {len(values)}")
-        parts.append(format_melody_array(row["name"].lower().replace(" ", "_"), values))
-        parts.append("")
-        entry_refs.append((display_name, pid, array_name, len(values)))
-
-    # Generate MELODY_LIBRARY array
-    parts.append("// --- MELODY_LIBRARY entries ---")
-    parts.append("// Add these to your firmware's MELODY_LIBRARY[] array:")
-    parts.append("// {")
-    for display_name, pid, array_name, step_count in entry_refs:
-        parts.append(f'// {{ "{display_name}", "{pid}", {array_name}, {step_count} }},')
-    parts.append("// };")
-
-    return "\n".join(parts)
-
-
 # ============================================================================
 # Assignment
 # ============================================================================
@@ -330,9 +251,6 @@ async def assign_to_melody(built_id: str, firestore_melody_id: str) -> BuiltMelo
         assigned.append(firestore_melody_id)
         await db.update_assigned_melody_ids(built_id, assigned)
 
-    # Clear outdated flag on the melody being assigned
-    await _flag_melodies_outdated([firestore_melody_id], False)
-
     return await get_built_melody(built_id)
@@ -344,10 +262,6 @@ async def unassign_from_melody(built_id: str, firestore_melody_id: str) -> Built
 
     assigned = [mid for mid in row.get("assigned_melody_ids", []) if mid != firestore_melody_id]
     await db.update_assigned_melody_ids(built_id, assigned)
 
-    # Flag the melody as outdated since it no longer has an archetype
-    await _flag_melodies_outdated([firestore_melody_id], True)
-
     return await get_built_melody(built_id)
@@ -358,48 +272,3 @@ async def get_built_melody_for_firestore_id(firestore_melody_id: str) -> Optiona
     if firestore_melody_id in row.get("assigned_melody_ids", []):
         return _row_to_built_melody(row)
     return None
-
-
-# ============================================================================
-# Outdated Flag Helpers
-# ============================================================================
-
-async def _flag_melodies_outdated(melody_ids: List[str], outdated: bool) -> None:
-    """Set or clear the outdated_archetype flag on a list of Firestore melody IDs.
-
-    This updates both SQLite (melody_drafts) and Firestore (published melodies).
-    We import inline to avoid circular imports.
-    """
-    if not melody_ids:
-        return
-
-    try:
-        from melodies import database as melody_db
-        from shared.firebase import get_db as get_firestore
-    except ImportError:
-        logger.warning("Could not import melody/firebase modules — skipping outdated flag update")
-        return
-
-    firestore_db = get_firestore()
-
-    for melody_id in melody_ids:
-        try:
-            row = await melody_db.get_melody(melody_id)
-            if not row:
-                continue
-
-            data = row["data"]
-            info = dict(data.get("information", {}))
-            info["outdated_archetype"] = outdated
-            data["information"] = info
-
-            await melody_db.update_melody(melody_id, data)
-
-            # If published, also update Firestore
-            if row.get("status") == "published":
-                doc_ref = firestore_db.collection("melodies").document(melody_id)
-                doc_ref.update({"information.outdated_archetype": outdated})
-
-            logger.info(f"Set outdated_archetype={outdated} on melody {melody_id}")
-        except Exception as e:
-            logger.error(f"Failed to set outdated flag on melody {melody_id}: {e}")

@@ -22,14 +22,13 @@ class Settings(BaseSettings):
     mosquitto_password_file: str = "/etc/mosquitto/passwd"
     mqtt_client_id: str = "bellsystems-admin-panel"
 
-    # SQLite (local application database)
-    sqlite_db_path: str = "./data/database.db"
+    # SQLite (MQTT data storage)
+    sqlite_db_path: str = "./mqtt_data.db"
-    mqtt_data_retention_days: int = 90
 
     # Local file storage
     built_melodies_storage_path: str = "./storage/built_melodies"
     firmware_storage_path: str = "./storage/firmware"
     flash_assets_storage_path: str = "./storage/flash_assets"
 
     # Email (Resend)
     resend_api_key: str = "re_placeholder_change_me"

@@ -1,11 +1,11 @@
 import asyncio
 import logging
-from fastapi import APIRouter, Depends, Query, BackgroundTasks, Body
+from fastapi import APIRouter, Depends, Query, BackgroundTasks
 from typing import Optional
 
 from auth.models import TokenPayload
 from auth.dependencies import require_permission
-from crm.models import CustomerCreate, CustomerUpdate, CustomerInDB, CustomerListResponse, TransactionEntry
+from crm.models import CustomerCreate, CustomerUpdate, CustomerInDB, CustomerListResponse
 from crm import service, nextcloud
 from config import settings
@@ -14,25 +14,15 @@ logger = logging.getLogger(__name__)
 
 
 @router.get("", response_model=CustomerListResponse)
-async def list_customers(
+def list_customers(
     search: Optional[str] = Query(None),
     tag: Optional[str] = Query(None),
-    sort: Optional[str] = Query(None),
     _user: TokenPayload = Depends(require_permission("crm", "view")),
 ):
-    customers = service.list_customers(search=search, tag=tag, sort=sort)
-    if sort == "latest_comm":
-        customers = await service.list_customers_sorted_by_latest_comm(customers)
+    customers = service.list_customers(search=search, tag=tag)
     return CustomerListResponse(customers=customers, total=len(customers))
 
 
-@router.get("/tags", response_model=list[str])
-def list_tags(
-    _user: TokenPayload = Depends(require_permission("crm", "view")),
-):
-    return service.list_all_tags()
-
-
 @router.get("/{customer_id}", response_model=CustomerInDB)
 def get_customer(
     customer_id: str,
@@ -74,172 +64,8 @@ def update_customer(
 
 
 @router.delete("/{customer_id}", status_code=204)
-async def delete_customer(
+def delete_customer(
     customer_id: str,
-    wipe_comms: bool = Query(False),
-    wipe_files: bool = Query(False),
-    wipe_nextcloud: bool = Query(False),
     _user: TokenPayload = Depends(require_permission("crm", "edit")),
 ):
-    customer = service.delete_customer(customer_id)
-    nc_path = service.get_customer_nc_path(customer)
-
-    if wipe_comms or wipe_nextcloud:
-        await service.delete_customer_comms(customer_id)
-
-    if wipe_files or wipe_nextcloud:
-        await service.delete_customer_media_entries(customer_id)
-
-    if settings.nextcloud_url:
-        folder = f"customers/{nc_path}"
-        if wipe_nextcloud:
-            try:
-                await nextcloud.delete_file(folder)
-            except Exception as e:
-                logger.warning("Could not delete NC folder for customer %s: %s", customer_id, e)
-        elif wipe_files:
-            stale_folder = f"customers/STALE_{nc_path}"
-            try:
-                await nextcloud.rename_folder(folder, stale_folder)
-            except Exception as e:
-                logger.warning("Could not rename NC folder for customer %s: %s", customer_id, e)
-
-
-@router.get("/{customer_id}/last-comm-direction")
-async def get_last_comm_direction(
-    customer_id: str,
-    _user: TokenPayload = Depends(require_permission("crm", "view")),
-):
-    result = await service.get_last_comm_direction(customer_id)
-    return result
-
-
-# ── Relationship Status ───────────────────────────────────────────────────────
-
-@router.patch("/{customer_id}/relationship-status", response_model=CustomerInDB)
-def update_relationship_status(
-    customer_id: str,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.update_relationship_status(customer_id, body.get("status", ""))
-
-
-# ── Technical Issues ──────────────────────────────────────────────────────────
-
-@router.post("/{customer_id}/technical-issues", response_model=CustomerInDB)
-def add_technical_issue(
-    customer_id: str,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.add_technical_issue(
-        customer_id,
-        note=body.get("note", ""),
-        opened_by=body.get("opened_by", ""),
-        date=body.get("date"),
-    )
-
-
-@router.patch("/{customer_id}/technical-issues/{index}/resolve", response_model=CustomerInDB)
-def resolve_technical_issue(
-    customer_id: str,
-    index: int,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.resolve_technical_issue(customer_id, index, body.get("resolved_by", ""))
-
-
-@router.patch("/{customer_id}/technical-issues/{index}", response_model=CustomerInDB)
-def edit_technical_issue(
-    customer_id: str,
-    index: int,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.edit_technical_issue(customer_id, index, body.get("note", ""), body.get("opened_date"))
-
-
-@router.delete("/{customer_id}/technical-issues/{index}", response_model=CustomerInDB)
-def delete_technical_issue(
-    customer_id: str,
-    index: int,
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.delete_technical_issue(customer_id, index)
-
-
-# ── Install Support ───────────────────────────────────────────────────────────
-
-@router.post("/{customer_id}/install-support", response_model=CustomerInDB)
-def add_install_support(
-    customer_id: str,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.add_install_support(
-        customer_id,
-        note=body.get("note", ""),
-        opened_by=body.get("opened_by", ""),
-        date=body.get("date"),
-    )
-
-
-@router.patch("/{customer_id}/install-support/{index}/resolve", response_model=CustomerInDB)
-def resolve_install_support(
-    customer_id: str,
-    index: int,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.resolve_install_support(customer_id, index, body.get("resolved_by", ""))
-
-
-@router.patch("/{customer_id}/install-support/{index}", response_model=CustomerInDB)
-def edit_install_support(
-    customer_id: str,
-    index: int,
-    body: dict = Body(...),
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.edit_install_support(customer_id, index, body.get("note", ""), body.get("opened_date"))
-
-
-@router.delete("/{customer_id}/install-support/{index}", response_model=CustomerInDB)
-def delete_install_support(
-    customer_id: str,
-    index: int,
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.delete_install_support(customer_id, index)
-
-
-# ── Transactions ──────────────────────────────────────────────────────────────
-
-@router.post("/{customer_id}/transactions", response_model=CustomerInDB)
-def add_transaction(
-    customer_id: str,
-    body: TransactionEntry,
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.add_transaction(customer_id, body)
-
-
-@router.patch("/{customer_id}/transactions/{index}", response_model=CustomerInDB)
-def update_transaction(
-    customer_id: str,
-    index: int,
-    body: TransactionEntry,
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.update_transaction(customer_id, index, body)
-
-
-@router.delete("/{customer_id}/transactions/{index}", response_model=CustomerInDB)
-def delete_transaction(
-    customer_id: str,
-    index: int,
-    _user: TokenPayload = Depends(require_permission("crm", "edit")),
-):
-    return service.delete_transaction(customer_id, index)
+    service.delete_customer(customer_id)

@@ -23,7 +23,7 @@ from email import encoders
 from typing import List, Optional, Tuple
 
 from config import settings
-import database as mqtt_db
+from mqtt import database as mqtt_db
 from crm.mail_accounts import get_mail_accounts, account_by_key, account_by_email
 
 logger = logging.getLogger("crm.email_sync")

@@ -1,5 +1,5 @@
|
||||
from enum import Enum
|
||||
from typing import Any, Dict, List, Optional
|
||||
from typing import List, Optional
|
||||
from pydantic import BaseModel
|
||||
|
||||
|
||||
@@ -35,10 +35,6 @@ class ProductCreate(BaseModel):
|
||||
sku: Optional[str] = None
|
||||
category: ProductCategory
|
||||
description: Optional[str] = None
|
||||
name_en: Optional[str] = None
|
||||
name_gr: Optional[str] = None
|
||||
description_en: Optional[str] = None
|
||||
description_gr: Optional[str] = None
|
||||
price: float
|
||||
currency: str = "EUR"
|
||||
costs: Optional[ProductCosts] = None
|
||||
@@ -53,10 +49,6 @@ class ProductUpdate(BaseModel):
|
||||
sku: Optional[str] = None
|
||||
category: Optional[ProductCategory] = None
|
||||
description: Optional[str] = None
|
||||
name_en: Optional[str] = None
|
||||
name_gr: Optional[str] = None
|
||||
description_en: Optional[str] = None
|
||||
description_gr: Optional[str] = None
|
||||
price: Optional[float] = None
|
||||
currency: Optional[str] = None
|
||||
costs: Optional[ProductCosts] = None
|
||||
@@ -122,55 +114,9 @@ class OwnedItem(BaseModel):
|
||||
|
||||
|
||||
class CustomerLocation(BaseModel):
|
||||
address: Optional[str] = None
|
||||
city: Optional[str] = None
|
||||
postal_code: Optional[str] = None
|
||||
region: Optional[str] = None
|
||||
country: Optional[str] = None
|
||||
|
||||
|
||||
# ── New customer status models ────────────────────────────────────────────────
|
||||
|
||||
class TechnicalIssue(BaseModel):
|
||||
active: bool = True
|
||||
opened_date: str # ISO string
|
||||
resolved_date: Optional[str] = None
|
||||
note: str
|
||||
opened_by: str
|
||||
resolved_by: Optional[str] = None
|
||||
|
||||
|
||||
class InstallSupportEntry(BaseModel):
|
||||
active: bool = True
|
||||
opened_date: str # ISO string
|
||||
resolved_date: Optional[str] = None
|
||||
note: str
|
||||
opened_by: str
|
||||
resolved_by: Optional[str] = None
|
||||
|
||||
|
||||
class TransactionEntry(BaseModel):
|
||||
date: str # ISO string
|
||||
flow: str # "invoice" | "payment" | "refund" | "credit"
|
||||
payment_type: Optional[str] = None # "cash" | "bank_transfer" | "card" | "paypal" — null for invoices
|
||||
category: str # "full_payment" | "advance" | "installment"
|
||||
amount: float
|
||||
currency: str = "EUR"
|
||||
invoice_ref: Optional[str] = None
|
||||
order_ref: Optional[str] = None
|
||||
recorded_by: str
|
||||
note: str = ""
|
||||
|
||||
|
||||
# Lightweight summary stored on customer doc for fast CustomerList expanded view
|
||||
class CrmSummary(BaseModel):
|
||||
active_order_status: Optional[str] = None
|
||||
active_order_status_date: Optional[str] = None
|
||||
active_order_title: Optional[str] = None
|
||||
active_issues_count: int = 0
|
||||
latest_issue_date: Optional[str] = None
|
||||
active_support_count: int = 0
|
||||
latest_support_date: Optional[str] = None
|
||||
region: Optional[str] = None
|
||||
|
||||
|
||||
class CustomerCreate(BaseModel):
|
||||
@@ -178,7 +124,6 @@ class CustomerCreate(BaseModel):
|
||||
name: str
|
||||
surname: Optional[str] = None
|
||||
organization: Optional[str] = None
|
||||
religion: Optional[str] = None
|
||||
contacts: List[CustomerContact] = []
|
||||
notes: List[CustomerNote] = []
|
||||
location: Optional[CustomerLocation] = None
|
||||
@@ -187,12 +132,7 @@ class CustomerCreate(BaseModel):
|
||||
owned_items: List[OwnedItem] = []
|
||||
linked_user_ids: List[str] = []
|
||||
nextcloud_folder: Optional[str] = None
|
||||
folder_id: Optional[str] = None
|
||||
relationship_status: str = "lead"
|
||||
technical_issues: List[Dict[str, Any]] = []
|
||||
install_support: List[Dict[str, Any]] = []
|
||||
transaction_history: List[Dict[str, Any]] = []
|
||||
crm_summary: Optional[Dict[str, Any]] = None
|
||||
folder_id: Optional[str] = None # Human-readable Nextcloud folder name, e.g. "saint-john-corfu"
|
||||
|
||||
|
||||
class CustomerUpdate(BaseModel):
|
||||
@@ -200,7 +140,6 @@ class CustomerUpdate(BaseModel):
|
||||
name: Optional[str] = None
|
||||
surname: Optional[str] = None
|
||||
organization: Optional[str] = None
|
||||
religion: Optional[str] = None
|
||||
contacts: Optional[List[CustomerContact]] = None
|
||||
notes: Optional[List[CustomerNote]] = None
|
||||
location: Optional[CustomerLocation] = None
|
||||
@@ -209,7 +148,6 @@ class CustomerUpdate(BaseModel):
|
||||
owned_items: Optional[List[OwnedItem]] = None
|
||||
linked_user_ids: Optional[List[str]] = None
|
||||
nextcloud_folder: Optional[str] = None
|
||||
relationship_status: Optional[str] = None
|
||||
# folder_id intentionally excluded from update — set once at creation
|
||||
|
||||
|
||||
@@ -227,34 +165,18 @@ class CustomerListResponse(BaseModel):
|
||||
# ── Orders ───────────────────────────────────────────────────────────────────
|
||||
|
||||
class OrderStatus(str, Enum):
|
||||
negotiating = "negotiating"
|
||||
awaiting_quotation = "awaiting_quotation"
|
||||
awaiting_customer_confirmation = "awaiting_customer_confirmation"
|
||||
awaiting_fulfilment = "awaiting_fulfilment"
|
||||
awaiting_payment = "awaiting_payment"
|
||||
manufacturing = "manufacturing"
|
||||
draft = "draft"
|
||||
confirmed = "confirmed"
|
||||
in_production = "in_production"
|
||||
shipped = "shipped"
|
||||
installed = "installed"
|
||||
declined = "declined"
|
||||
complete = "complete"
|
||||
delivered = "delivered"
|
||||
cancelled = "cancelled"
|
||||
|
||||
|
||||
class OrderPaymentStatus(BaseModel):
|
||||
required_amount: float = 0
|
||||
received_amount: float = 0
|
||||
balance_due: float = 0
|
||||
advance_required: bool = False
|
||||
advance_amount: Optional[float] = None
|
||||
payment_complete: bool = False
|
||||
|
||||
|
||||
class OrderTimelineEvent(BaseModel):
|
||||
date: str # ISO string
|
||||
type: str # "quote_request" | "quote_sent" | "quote_accepted" | "quote_declined"
|
||||
# | "mfg_started" | "mfg_complete" | "order_shipped" | "installed"
|
||||
# | "payment_received" | "invoice_sent" | "note"
|
||||
note: str = ""
|
||||
updated_by: str
|
||||
class PaymentStatus(str, Enum):
|
||||
pending = "pending"
|
||||
partial = "partial"
|
||||
paid = "paid"
|
||||
|
||||
|
||||
class OrderDiscount(BaseModel):
|
||||
@@ -285,36 +207,29 @@ class OrderItem(BaseModel):
class OrderCreate(BaseModel):
    customer_id: str
    order_number: Optional[str] = None
    title: Optional[str] = None
    created_by: Optional[str] = None
    status: OrderStatus = OrderStatus.negotiating
    status_updated_date: Optional[str] = None
    status_updated_by: Optional[str] = None
    status: OrderStatus = OrderStatus.draft
    items: List[OrderItem] = []
    subtotal: float = 0
    discount: Optional[OrderDiscount] = None
    total_price: float = 0
    currency: str = "EUR"
    shipping: Optional[OrderShipping] = None
    payment_status: Optional[Dict[str, Any]] = None
    payment_status: PaymentStatus = PaymentStatus.pending
    invoice_path: Optional[str] = None
    notes: Optional[str] = None
    timeline: List[Dict[str, Any]] = []


class OrderUpdate(BaseModel):
    customer_id: Optional[str] = None
    order_number: Optional[str] = None
    title: Optional[str] = None
    status: Optional[OrderStatus] = None
    status_updated_date: Optional[str] = None
    status_updated_by: Optional[str] = None
    items: Optional[List[OrderItem]] = None
    subtotal: Optional[float] = None
    discount: Optional[OrderDiscount] = None
    total_price: Optional[float] = None
    currency: Optional[str] = None
    shipping: Optional[OrderShipping] = None
    payment_status: Optional[Dict[str, Any]] = None
    payment_status: Optional[PaymentStatus] = None
    invoice_path: Optional[str] = None
    notes: Optional[str] = None

@@ -371,11 +286,8 @@ class CommCreate(BaseModel):


class CommUpdate(BaseModel):
    type: Optional[CommType] = None
    direction: Optional[CommDirection] = None
    subject: Optional[str] = None
    body: Optional[str] = None
    logged_by: Optional[str] = None
    occurred_at: Optional[str] = None


@@ -421,7 +333,6 @@ class MediaCreate(BaseModel):
    direction: Optional[MediaDirection] = None
    tags: List[str] = []
    uploaded_by: Optional[str] = None
    thumbnail_path: Optional[str] = None


class MediaInDB(BaseModel):
@@ -435,7 +346,6 @@ class MediaInDB(BaseModel):
    tags: List[str] = []
    uploaded_by: Optional[str] = None
    created_at: str
    thumbnail_path: Optional[str] = None


class MediaListResponse(BaseModel):

@@ -312,18 +312,3 @@ async def delete_file(relative_path: str) -> None:
    resp = await client.request("DELETE", url, auth=_auth())
    if resp.status_code not in (200, 204, 404):
        raise HTTPException(status_code=502, detail=f"Nextcloud delete failed: {resp.status_code}")


async def rename_folder(old_relative_path: str, new_relative_path: str) -> None:
    """Rename/move a folder in Nextcloud using WebDAV MOVE."""
    url = _full_url(old_relative_path)
    destination = _full_url(new_relative_path)
    client = _get_client()
    resp = await client.request(
        "MOVE",
        url,
        auth=_auth(),
        headers={"Destination": destination, "Overwrite": "F"},
    )
    if resp.status_code not in (201, 204):
        raise HTTPException(status_code=502, detail=f"Nextcloud rename failed: {resp.status_code}")

@@ -10,7 +10,6 @@ Folder convention (all paths relative to nextcloud_base_path = BellSystems/Conso
folder_id = customer.folder_id if set, else customer.id (legacy fallback).
"""
from fastapi import APIRouter, Depends, Query, UploadFile, File, Form, Response, HTTPException, Request
from fastapi.responses import StreamingResponse
from typing import Optional

from jose import JWTError
@@ -18,9 +17,7 @@ from auth.models import TokenPayload
from auth.dependencies import require_permission
from auth.utils import decode_access_token
from crm import nextcloud, service
from config import settings
from crm.models import MediaCreate, MediaDirection
from crm.thumbnails import generate_thumbnail

router = APIRouter(prefix="/api/crm/nextcloud", tags=["crm-nextcloud"])

@@ -33,29 +30,6 @@ DIRECTION_MAP = {
}


@router.get("/web-url")
async def get_web_url(
    path: str = Query(..., description="Path relative to nextcloud_base_path"),
    _user: TokenPayload = Depends(require_permission("crm", "view")),
):
    """
    Return the Nextcloud Files web-UI URL for a given file path.
    Opens the parent folder with the file highlighted.
    """
    if not settings.nextcloud_url:
        raise HTTPException(status_code=503, detail="Nextcloud not configured")
    base = settings.nextcloud_base_path.strip("/")
    # path is relative to base, e.g. "customers/abc/media/photo.jpg"
    parts = path.rsplit("/", 1)
    folder_rel = parts[0] if len(parts) == 2 else ""
    filename = parts[-1]
    nc_dir = f"/{base}/{folder_rel}" if folder_rel else f"/{base}"
    from urllib.parse import urlencode, quote
    qs = urlencode({"dir": nc_dir, "scrollto": filename})
    url = f"{settings.nextcloud_url.rstrip('/')}/index.php/apps/files/?{qs}"
    return {"url": url}


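The `dir`/`scrollto` construction above is easy to exercise on its own. A minimal sketch under the same logic, with a hypothetical helper name and an illustrative instance URL (the base path `BellSystems/Console` follows the folder convention quoted earlier in the diff):

```python
from urllib.parse import urlencode

def nextcloud_web_url(nextcloud_url: str, base_path: str, path: str) -> str:
    """Build a Nextcloud Files deep link that opens the parent folder
    with the given file highlighted (dir + scrollto query params)."""
    base = base_path.strip("/")
    parts = path.rsplit("/", 1)
    folder_rel = parts[0] if len(parts) == 2 else ""
    filename = parts[-1]
    nc_dir = f"/{base}/{folder_rel}" if folder_rel else f"/{base}"
    qs = urlencode({"dir": nc_dir, "scrollto": filename})
    return f"{nextcloud_url.rstrip('/')}/index.php/apps/files/?{qs}"

url = nextcloud_web_url("https://cloud.example.com", "BellSystems/Console",
                        "customers/abc/media/photo.jpg")
print(url)
# https://cloud.example.com/index.php/apps/files/?dir=%2FBellSystems%2FConsole%2Fcustomers%2Fabc%2Fmedia&scrollto=photo.jpg
```

`urlencode` percent-escapes the slashes in `dir`, which is exactly what the Files app expects in its query string.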
@router.get("/browse")
async def browse(
    path: str = Query(..., description="Path relative to nextcloud_base_path"),
@@ -82,14 +56,6 @@ async def browse_all(

    all_files = await nextcloud.list_folder_recursive(base)

    # Exclude _info.txt stubs — human-readable only, should never appear in the UI.
    # .thumbs/ files are kept: the frontend needs them to build the thumbnail map
    # (it already filters them out of the visible file list itself).
    all_files = [
        f for f in all_files
        if not f["path"].endswith("/_info.txt")
    ]

    # Tag each file with the top-level subfolder it lives under
    for item in all_files:
        parts = item["path"].split("/")
@@ -118,54 +84,33 @@ async def proxy_file(
    except (JWTError, KeyError):
        raise HTTPException(status_code=403, detail="Invalid token")

    # Forward the Range header to Nextcloud so we get a true partial response
    # without buffering the whole file into memory.
    nc_url = nextcloud._full_url(path)
    nc_auth = nextcloud._auth()
    forward_headers = {}
    content, mime_type = await nextcloud.download_file(path)
    total = len(content)

    range_header = request.headers.get("range")
    if range_header:
        forward_headers["Range"] = range_header

    import httpx as _httpx

    # Use a dedicated streaming client — httpx.stream() keeps the connection open
    # for the lifetime of the generator, so we can't reuse the shared persistent client.
    # We enter the stream context here to get headers immediately (no body buffering),
    # then hand the body iterator to StreamingResponse.
    stream_client = _httpx.AsyncClient(timeout=None, follow_redirects=True)
    nc_resp_ctx = stream_client.stream("GET", nc_url, auth=nc_auth, headers=forward_headers)
    nc_resp = await nc_resp_ctx.__aenter__()

    if nc_resp.status_code == 404:
        await nc_resp_ctx.__aexit__(None, None, None)
        await stream_client.aclose()
        raise HTTPException(status_code=404, detail="File not found in Nextcloud")
    if nc_resp.status_code not in (200, 206):
        await nc_resp_ctx.__aexit__(None, None, None)
        await stream_client.aclose()
        raise HTTPException(status_code=502, detail=f"Nextcloud returned {nc_resp.status_code}")

    mime_type = nc_resp.headers.get("content-type", "application/octet-stream").split(";")[0].strip()

    resp_headers = {"Accept-Ranges": "bytes"}
    for h in ("content-range", "content-length"):
        if h in nc_resp.headers:
            resp_headers[h.title()] = nc_resp.headers[h]

    async def _stream():
    if range_header and range_header.startswith("bytes="):
        # Parse "bytes=start-end"
        try:
            async for chunk in nc_resp.aiter_bytes(chunk_size=64 * 1024):
                yield chunk
        finally:
            await nc_resp_ctx.__aexit__(None, None, None)
            await stream_client.aclose()
            range_spec = range_header[6:]
            start_str, _, end_str = range_spec.partition("-")
            start = int(start_str) if start_str else 0
            end = int(end_str) if end_str else total - 1
            end = min(end, total - 1)
            chunk = content[start:end + 1]
            headers = {
                "Content-Range": f"bytes {start}-{end}/{total}",
                "Accept-Ranges": "bytes",
                "Content-Length": str(len(chunk)),
                "Content-Type": mime_type,
            }
            return Response(content=chunk, status_code=206, headers=headers, media_type=mime_type)
        except (ValueError, IndexError):
            pass

    return StreamingResponse(
        _stream(),
        status_code=nc_resp.status_code,
    return Response(
        content=content,
        media_type=mime_type,
        headers=resp_headers,
        headers={"Accept-Ranges": "bytes", "Content-Length": str(total)},
    )


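Both versions of the proxy above hinge on a `Range: bytes=start-end` header; the buffered branch parses it inline. A self-contained sketch of that parsing, mirroring the same simplified clamp-to-total logic (`parse_range` is a hypothetical name, and like the original it only handles a single `start-end` range, not suffix or multi-range forms):

```python
def parse_range(range_header, total):
    """Parse a single-range 'bytes=start-end' header into inclusive
    (start, end) offsets, clamped to the file size. Returns None for
    headers this simplified parser does not understand."""
    if not range_header.startswith("bytes="):
        return None
    range_spec = range_header[6:]          # strip "bytes="
    start_str, _, end_str = range_spec.partition("-")
    try:
        start = int(start_str) if start_str else 0
        end = int(end_str) if end_str else total - 1   # open-ended: to EOF
    except ValueError:
        return None
    return start, min(end, total - 1)      # clamp end to the last byte

print(parse_range("bytes=0-499", 1000))    # (0, 499)
print(parse_range("bytes=500-", 1000))     # (500, 999)
```

The returned offsets are inclusive, matching the `Content-Range: bytes {start}-{end}/{total}` header the handler emits.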
@@ -219,24 +164,6 @@ async def upload_file(
    mime_type = file.content_type or "application/octet-stream"
    await nextcloud.upload_file(file_path, content, mime_type)

    # Generate and upload thumbnail (best-effort, non-blocking)
    # Always stored as {stem}.jpg regardless of source extension so the thumb
    # filename is unambiguous and the existence check can never false-positive.
    thumb_path = None
    try:
        thumb_bytes = generate_thumbnail(content, mime_type, file.filename)
        if thumb_bytes:
            thumb_folder = f"{target_folder}/.thumbs"
            stem = file.filename.rsplit(".", 1)[0] if "." in file.filename else file.filename
            thumb_filename = f"{stem}.jpg"
            thumb_nc_path = f"{thumb_folder}/{thumb_filename}"
            await nextcloud.ensure_folder(thumb_folder)
            await nextcloud.upload_file(thumb_nc_path, thumb_bytes, "image/jpeg")
            thumb_path = thumb_nc_path
    except Exception as e:
        import logging
        logging.getLogger(__name__).warning("Thumbnail generation failed for %s: %s", file.filename, e)

    # Resolve direction
    resolved_direction = None
    if direction:
@@ -257,7 +184,6 @@ async def upload_file(
        direction=resolved_direction,
        tags=tag_list,
        uploaded_by=_user.name,
        thumbnail_path=thumb_path,
    ))

    return media_record
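The `{stem}.jpg` naming rule above can be checked in isolation. A sketch under the same convention (`thumb_path_for` is a hypothetical helper, not part of the codebase):

```python
def thumb_path_for(target_folder, filename):
    """Thumbnails live in a .thumbs/ sub-folder of the target folder and
    are always stored as {stem}.jpg, whatever the source extension was."""
    stem = filename.rsplit(".", 1)[0] if "." in filename else filename
    return f"{target_folder}/.thumbs/{stem}.jpg"

print(thumb_path_for("customers/abc/media", "site-visit.HEIC"))
# customers/abc/media/.thumbs/site-visit.jpg
```

Note that `rsplit(".", 1)` only strips the last extension, so `archive.tar.gz` maps to `archive.tar.jpg`, and extension-less filenames simply get `.jpg` appended.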
@@ -318,11 +244,6 @@ async def sync_nextcloud_files(

    # Collect all NC files recursively (handles nested folders at any depth)
    all_nc_files = await nextcloud.list_folder_recursive(base)
    # Skip .thumbs/ folder contents and the _info.txt stub — these are internal
    all_nc_files = [
        f for f in all_nc_files
        if "/.thumbs/" not in f["path"] and not f["path"].endswith("/_info.txt")
    ]
    for item in all_nc_files:
        parts = item["path"].split("/")
        item["_subfolder"] = parts[2] if len(parts) > 2 else "media"
@@ -353,105 +274,6 @@ async def sync_nextcloud_files(
    return {"synced": synced, "skipped": skipped}


@router.post("/generate-thumbs")
async def generate_thumbs(
    customer_id: str = Form(...),
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    """
    Scan all customer files in Nextcloud and generate thumbnails for any file
    that doesn't already have one in the corresponding .thumbs/ sub-folder.
    Skips files inside .thumbs/ itself and file types that can't be thumbnailed.
    Returns counts of generated, skipped (already exists), and failed files.
    """
    customer = service.get_customer(customer_id)
    nc_path = service.get_customer_nc_path(customer)
    base = f"customers/{nc_path}"

    all_nc_files = await nextcloud.list_folder_recursive(base)

    # Build a set of existing thumb paths for O(1) lookup
    existing_thumbs = {
        f["path"] for f in all_nc_files if "/.thumbs/" in f["path"]
    }

    # Only process real files (not thumbs themselves)
    candidates = [f for f in all_nc_files if "/.thumbs/" not in f["path"]]

    generated = 0
    skipped = 0
    failed = 0

    for f in candidates:
        # Derive where the thumb would live
        path = f["path"]  # e.g. customers/{nc_path}/{subfolder}/photo.jpg
        parts = path.rsplit("/", 1)
        if len(parts) != 2:
            skipped += 1
            continue
        parent_folder, filename = parts
        stem = filename.rsplit(".", 1)[0] if "." in filename else filename
        thumb_filename = f"{stem}.jpg"
        thumb_nc_path = f"{parent_folder}/.thumbs/{thumb_filename}"

        if thumb_nc_path in existing_thumbs:
            skipped += 1
            continue

        # Download the file, generate thumb, upload
        try:
            content, mime_type = await nextcloud.download_file(path)
            thumb_bytes = generate_thumbnail(content, mime_type, filename)
            if not thumb_bytes:
                skipped += 1  # unsupported file type
                continue
            thumb_folder = f"{parent_folder}/.thumbs"
            await nextcloud.ensure_folder(thumb_folder)
            await nextcloud.upload_file(thumb_nc_path, thumb_bytes, "image/jpeg")
            generated += 1
        except Exception as e:
            import logging
            logging.getLogger(__name__).warning("Thumb gen failed for %s: %s", path, e)
            failed += 1

    return {"generated": generated, "skipped": skipped, "failed": failed}


@router.post("/clear-thumbs")
async def clear_thumbs(
    customer_id: str = Form(...),
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    """
    Delete all .thumbs sub-folders for a customer across all subfolders.
    This lets you regenerate thumbnails from scratch.
    Returns count of .thumbs folders deleted.
    """
    customer = service.get_customer(customer_id)
    nc_path = service.get_customer_nc_path(customer)
    base = f"customers/{nc_path}"

    all_nc_files = await nextcloud.list_folder_recursive(base)

    # Collect unique .thumbs folder paths
    thumb_folders = set()
    for f in all_nc_files:
        if "/.thumbs/" in f["path"]:
            folder = f["path"].split("/.thumbs/")[0] + "/.thumbs"
            thumb_folders.add(folder)

    deleted = 0
    for folder in thumb_folders:
        try:
            await nextcloud.delete_file(folder)
            deleted += 1
        except Exception as e:
            import logging
            logging.getLogger(__name__).warning("Failed to delete .thumbs folder %s: %s", folder, e)

    return {"deleted_folders": deleted}


@router.post("/untrack-deleted")
async def untrack_deleted_files(
    customer_id: str = Form(...),
@@ -465,22 +287,15 @@ async def untrack_deleted_files(
    nc_path = service.get_customer_nc_path(customer)
    base = f"customers/{nc_path}"

    # Collect all NC file paths recursively (excluding thumbs and info stub)
    # Collect all NC file paths recursively
    all_nc_files = await nextcloud.list_folder_recursive(base)
    nc_paths = {
        item["path"] for item in all_nc_files
        if "/.thumbs/" not in item["path"] and not item["path"].endswith("/_info.txt")
    }
    nc_paths = {item["path"] for item in all_nc_files}

    # Find DB records whose NC path no longer exists, OR that are internal files
    # (_info.txt / .thumbs/) which should never have been tracked in the first place.
    # Find DB records whose NC path no longer exists
    existing = await service.list_media(customer_id=customer_id)
    untracked = 0
    for m in existing:
        is_internal = m.nextcloud_path and (
            "/.thumbs/" in m.nextcloud_path or m.nextcloud_path.endswith("/_info.txt")
        )
        if m.nextcloud_path and (is_internal or m.nextcloud_path not in nc_paths):
        if m.nextcloud_path and m.nextcloud_path not in nc_paths:
            try:
                await service.delete_media(m.id)
                untracked += 1

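`clear_thumbs` recovers each unique `.thumbs` folder from the thumbnail file paths beneath it. The same split-and-rejoin step as a standalone sketch (`thumb_folders` is a hypothetical name, and the paths are illustrative):

```python
def thumb_folders(paths):
    """Collect the unique .thumbs folder for every thumbnail file path."""
    folders = set()
    for p in paths:
        if "/.thumbs/" in p:
            # Everything before the first "/.thumbs/" is the parent folder
            folders.add(p.split("/.thumbs/")[0] + "/.thumbs")
    return folders

paths = [
    "customers/abc/media/.thumbs/a.jpg",
    "customers/abc/media/.thumbs/b.jpg",
    "customers/abc/quotations/report.pdf",   # not a thumb, ignored
]
print(sorted(thumb_folders(paths)))
# ['customers/abc/media/.thumbs']
```

Using a set means each folder is deleted once even when it holds many thumbnails.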
@@ -6,146 +6,52 @@ from auth.dependencies import require_permission
from crm.models import OrderCreate, OrderUpdate, OrderInDB, OrderListResponse
from crm import service

router = APIRouter(prefix="/api/crm/customers/{customer_id}/orders", tags=["crm-orders"])
router = APIRouter(prefix="/api/crm/orders", tags=["crm-orders"])


@router.get("", response_model=OrderListResponse)
def list_orders(
    customer_id: str,
    customer_id: Optional[str] = Query(None),
    status: Optional[str] = Query(None),
    payment_status: Optional[str] = Query(None),
    _user: TokenPayload = Depends(require_permission("crm", "view")),
):
    orders = service.list_orders(customer_id)
    return OrderListResponse(orders=orders, total=len(orders))


# IMPORTANT: specific sub-paths must come before /{order_id}
@router.get("/next-order-number")
def get_next_order_number(
    customer_id: str,
    _user: TokenPayload = Depends(require_permission("crm", "view")),
):
    """Return the next globally unique order number (ORD-DDMMYY-NNN across all customers)."""
    return {"order_number": service._generate_order_number(customer_id)}


@router.post("/init-negotiations", response_model=OrderInDB, status_code=201)
def init_negotiations(
    customer_id: str,
    body: dict,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.init_negotiations(
    orders = service.list_orders(
        customer_id=customer_id,
        title=body.get("title", ""),
        note=body.get("note", ""),
        date=body.get("date"),
        created_by=body.get("created_by", ""),
        status=status,
        payment_status=payment_status,
    )


@router.post("", response_model=OrderInDB, status_code=201)
def create_order(
    customer_id: str,
    body: OrderCreate,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.create_order(customer_id, body)
    return OrderListResponse(orders=orders, total=len(orders))


@router.get("/{order_id}", response_model=OrderInDB)
def get_order(
    customer_id: str,
    order_id: str,
    _user: TokenPayload = Depends(require_permission("crm", "view")),
):
    return service.get_order(customer_id, order_id)
    return service.get_order(order_id)


@router.patch("/{order_id}", response_model=OrderInDB)
@router.post("", response_model=OrderInDB, status_code=201)
def create_order(
    body: OrderCreate,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.create_order(body)


@router.put("/{order_id}", response_model=OrderInDB)
def update_order(
    customer_id: str,
    order_id: str,
    body: OrderUpdate,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.update_order(customer_id, order_id, body)
    return service.update_order(order_id, body)


@router.delete("/{order_id}", status_code=204)
def delete_order(
    customer_id: str,
    order_id: str,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    service.delete_order(customer_id, order_id)


@router.post("/{order_id}/timeline", response_model=OrderInDB)
def append_timeline_event(
    customer_id: str,
    order_id: str,
    body: dict,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.append_timeline_event(customer_id, order_id, body)


@router.patch("/{order_id}/timeline/{index}", response_model=OrderInDB)
def update_timeline_event(
    customer_id: str,
    order_id: str,
    index: int,
    body: dict,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.update_timeline_event(customer_id, order_id, index, body)


@router.delete("/{order_id}/timeline/{index}", response_model=OrderInDB)
def delete_timeline_event(
    customer_id: str,
    order_id: str,
    index: int,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.delete_timeline_event(customer_id, order_id, index)


@router.patch("/{order_id}/payment-status", response_model=OrderInDB)
def update_payment_status(
    customer_id: str,
    order_id: str,
    body: dict,
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    return service.update_order_payment_status(customer_id, order_id, body)


# ── Global order list (collection group) ─────────────────────────────────────
# Separate router registered at /api/crm/orders for the global OrderList page

global_router = APIRouter(prefix="/api/crm/orders", tags=["crm-orders-global"])


@global_router.get("")
def list_all_orders(
    status: Optional[str] = Query(None),
    _user: TokenPayload = Depends(require_permission("crm", "view")),
):
    orders = service.list_all_orders(status=status)
    # Enrich with customer names
    customer_ids = list({o.customer_id for o in orders if o.customer_id})
    customer_names: dict[str, str] = {}
    for cid in customer_ids:
        try:
            c = service.get_customer(cid)
            parts = [c.name, c.organization] if c.organization else [c.name]
            customer_names[cid] = " / ".join(filter(None, parts))
        except Exception:
            pass
    enriched = []
    for o in orders:
        d = o.model_dump()
        d["customer_name"] = customer_names.get(o.customer_id)
        enriched.append(d)
    return {"orders": enriched, "total": len(enriched)}
    service.delete_order(order_id)

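The enrichment loop in the removed global order route builds a display label by joining the non-empty name parts with `" / "`. The same idiom in isolation (`customer_label` is a hypothetical helper; the names are illustrative):

```python
def customer_label(name, organization):
    """'Name / Organization' when both are set, otherwise whichever exists."""
    parts = [name, organization] if organization else [name]
    # filter(None, ...) drops None and empty strings before joining
    return " / ".join(filter(None, parts))

print(customer_label("Maria", "Acme Bells"))  # Maria / Acme Bells
print(customer_label("Maria", None))          # Maria
print(customer_label(None, "Acme Bells"))     # Acme Bells
```

The `filter(None, ...)` step is what keeps a missing name from producing a dangling `" / "` in the label.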
@@ -13,8 +13,6 @@ class QuotationStatus(str, Enum):
class QuotationItemCreate(BaseModel):
    product_id: Optional[str] = None
    description: Optional[str] = None
    description_en: Optional[str] = None
    description_gr: Optional[str] = None
    unit_type: str = "pcs"  # pcs / kg / m
    unit_cost: float = 0.0
    discount_percent: float = 0.0
@@ -54,10 +52,6 @@ class QuotationCreate(BaseModel):
    client_location: Optional[str] = None
    client_phone: Optional[str] = None
    client_email: Optional[str] = None
    # Legacy quotation fields
    is_legacy: bool = False
    legacy_date: Optional[str] = None  # ISO date string, manually set
    legacy_pdf_path: Optional[str] = None  # Nextcloud path to uploaded PDF


class QuotationUpdate(BaseModel):
@@ -85,10 +79,6 @@ class QuotationUpdate(BaseModel):
    client_location: Optional[str] = None
    client_phone: Optional[str] = None
    client_email: Optional[str] = None
    # Legacy quotation fields
    is_legacy: Optional[bool] = None
    legacy_date: Optional[str] = None
    legacy_pdf_path: Optional[str] = None


class QuotationInDB(BaseModel):
@@ -128,10 +118,6 @@ class QuotationInDB(BaseModel):
    client_location: Optional[str] = None
    client_phone: Optional[str] = None
    client_email: Optional[str] = None
    # Legacy quotation fields
    is_legacy: bool = False
    legacy_date: Optional[str] = None
    legacy_pdf_path: Optional[str] = None


class QuotationListItem(BaseModel):
@@ -144,9 +130,6 @@ class QuotationListItem(BaseModel):
    created_at: str
    updated_at: str
    nextcloud_pdf_url: Optional[str] = None
    is_legacy: bool = False
    legacy_date: Optional[str] = None
    legacy_pdf_path: Optional[str] = None


class QuotationListResponse(BaseModel):

@@ -1,4 +1,4 @@
from fastapi import APIRouter, Depends, Query, UploadFile, File
from fastapi import APIRouter, Depends, Query
from fastapi.responses import StreamingResponse
from typing import Optional
import io
@@ -28,14 +28,6 @@ async def get_next_number(
    return NextNumberResponse(next_number=next_num)


@router.get("/all", response_model=list[dict])
async def list_all_quotations(
    _user: TokenPayload = Depends(require_permission("crm", "view")),
):
    """Returns all quotations across all customers, each including customer_name."""
    return await svc.list_all_quotations()


@router.get("/customer/{customer_id}", response_model=QuotationListResponse)
async def list_quotations_for_customer(
    customer_id: str,
@@ -107,15 +99,3 @@ async def regenerate_pdf(
):
    """Force PDF regeneration and re-upload to Nextcloud."""
    return await svc.regenerate_pdf(quotation_id)


@router.post("/{quotation_id}/legacy-pdf", response_model=QuotationInDB)
async def upload_legacy_pdf(
    quotation_id: str,
    file: UploadFile = File(...),
    _user: TokenPayload = Depends(require_permission("crm", "edit")),
):
    """Upload a PDF file for a legacy quotation and store its Nextcloud path."""
    pdf_bytes = await file.read()
    filename = file.filename or f"legacy-{quotation_id}.pdf"
    return await svc.upload_legacy_pdf(quotation_id, pdf_bytes, filename)

@@ -19,7 +19,7 @@ from crm.quotation_models import (
    QuotationUpdate,
)
from crm.service import get_customer
import database as mqtt_db
from mqtt import database as mqtt_db

logger = logging.getLogger(__name__)

@@ -153,42 +153,10 @@ async def get_next_number() -> str:
    return await _generate_quotation_number(db)


async def list_all_quotations() -> list[dict]:
    """Return all quotations across all customers, with customer_name injected."""
    from shared.firebase import get_db as get_firestore
    db = await mqtt_db.get_db()
    rows = await db.execute_fetchall(
        "SELECT id, quotation_number, title, customer_id, status, final_total, created_at, updated_at, "
        "nextcloud_pdf_url, is_legacy, legacy_date, legacy_pdf_path "
        "FROM crm_quotations ORDER BY created_at DESC",
        (),
    )
    items = [dict(r) for r in rows]
    # Fetch unique customer names from Firestore in one pass
    customer_ids = {i["customer_id"] for i in items if i.get("customer_id")}
    customer_names: dict[str, str] = {}
    if customer_ids:
        fstore = get_firestore()
        for cid in customer_ids:
            try:
                doc = fstore.collection("crm_customers").document(cid).get()
                if doc.exists:
                    d = doc.to_dict()
                    parts = [d.get("name", ""), d.get("surname", ""), d.get("organization", "")]
                    label = " ".join(p for p in parts if p).strip()
                    customer_names[cid] = label or cid
            except Exception:
                customer_names[cid] = cid
    for item in items:
        item["customer_name"] = customer_names.get(item["customer_id"], "")
    return items


async def list_quotations(customer_id: str) -> list[QuotationListItem]:
    db = await mqtt_db.get_db()
    rows = await db.execute_fetchall(
        "SELECT id, quotation_number, title, customer_id, status, final_total, created_at, updated_at, "
        "nextcloud_pdf_url, is_legacy, legacy_date, legacy_pdf_path "
        "SELECT id, quotation_number, title, customer_id, status, final_total, created_at, updated_at, nextcloud_pdf_url "
        "FROM crm_quotations WHERE customer_id = ? ORDER BY created_at DESC",
        (customer_id,),
    )
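The two SELECT variants above differ only in the legacy columns they fetch; the newest-first listing pattern itself can be demonstrated against an in-memory SQLite database (table trimmed to a few illustrative columns and rows, since the real schema lives elsewhere in the repo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # rows addressable by column name, like the service code
conn.execute(
    "CREATE TABLE crm_quotations ("
    "id TEXT PRIMARY KEY, quotation_number TEXT, customer_id TEXT, created_at TEXT)"
)
conn.executemany(
    "INSERT INTO crm_quotations VALUES (?, ?, ?, ?)",
    [
        ("q1", "Q-001", "cust-a", "2024-01-01T10:00:00"),
        ("q2", "Q-002", "cust-a", "2024-03-01T10:00:00"),
        ("q3", "Q-003", "cust-b", "2024-02-01T10:00:00"),
    ],
)

# Newest-first listing for one customer, as in list_quotations();
# ISO-8601 timestamps sort correctly as plain strings.
rows = conn.execute(
    "SELECT id, quotation_number, created_at FROM crm_quotations "
    "WHERE customer_id = ? ORDER BY created_at DESC",
    ("cust-a",),
).fetchall()
print([r["quotation_number"] for r in rows])  # ['Q-002', 'Q-001']
```

Ordering on the stored ISO string avoids any date parsing in SQL, which is why the service can keep `created_at` as TEXT.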
@@ -242,7 +210,6 @@ async def create_quotation(data: QuotationCreate, generate_pdf: bool = False) ->
            subtotal_before_discount, global_discount_amount, new_subtotal, vat_amount, final_total,
            nextcloud_pdf_path, nextcloud_pdf_url,
            client_org, client_name, client_location, client_phone, client_email,
            is_legacy, legacy_date, legacy_pdf_path,
            created_at, updated_at
        ) VALUES (
            ?, ?, ?, ?, ?,
@@ -253,7 +220,6 @@ async def create_quotation(data: QuotationCreate, generate_pdf: bool = False) ->
            ?, ?, ?, ?, ?,
            NULL, NULL,
            ?, ?, ?, ?, ?,
            ?, ?, ?,
            ?, ?
        )""",
        (
@@ -265,7 +231,6 @@ async def create_quotation(data: QuotationCreate, generate_pdf: bool = False) ->
            totals["subtotal_before_discount"], totals["global_discount_amount"],
            totals["new_subtotal"], totals["vat_amount"], totals["final_total"],
            data.client_org, data.client_name, data.client_location, data.client_phone, data.client_email,
            1 if data.is_legacy else 0, data.legacy_date, data.legacy_pdf_path,
            now, now,
        ),
    )
@@ -275,12 +240,11 @@ async def create_quotation(data: QuotationCreate, generate_pdf: bool = False) ->
        item_id = str(uuid.uuid4())
        await db.execute(
            """INSERT INTO crm_quotation_items
            (id, quotation_id, product_id, description, description_en, description_gr,
            unit_type, unit_cost, discount_percent, quantity, vat_percent, line_total, sort_order)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
            (id, quotation_id, product_id, description, unit_type, unit_cost,
            discount_percent, quantity, vat_percent, line_total, sort_order)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
            (
                item_id, qid, item.get("product_id"), item.get("description"),
                item.get("description_en"), item.get("description_gr"),
                item.get("unit_type", "pcs"), item.get("unit_cost", 0),
                item.get("discount_percent", 0), item.get("quantity", 1),
                item.get("vat_percent", 24), item["line_total"], item.get("sort_order", i),
@@ -291,7 +255,7 @@ async def create_quotation(data: QuotationCreate, generate_pdf: bool = False) ->

    quotation = await get_quotation(qid)

    if generate_pdf and not data.is_legacy:
    if generate_pdf:
        quotation = await _do_generate_and_upload_pdf(quotation)

    return quotation
@@ -321,7 +285,6 @@ async def update_quotation(quotation_id: str, data: QuotationUpdate, generate_pd
        "shipping_cost", "shipping_cost_discount", "install_cost",
        "install_cost_discount", "extras_label", "extras_cost",
        "client_org", "client_name", "client_location", "client_phone", "client_email",
        "legacy_date", "legacy_pdf_path",
    ]

    for field in scalar_fields:
@@ -380,12 +343,11 @@ async def update_quotation(quotation_id: str, data: QuotationUpdate, generate_pd
        item_id = str(uuid.uuid4())
        await db.execute(
            """INSERT INTO crm_quotation_items
            (id, quotation_id, product_id, description, description_en, description_gr,
            unit_type, unit_cost, discount_percent, quantity, vat_percent, line_total, sort_order)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
            (id, quotation_id, product_id, description, unit_type, unit_cost,
            discount_percent, quantity, vat_percent, line_total, sort_order)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
            (
                item_id, quotation_id, item.get("product_id"), item.get("description"),
                item.get("description_en"), item.get("description_gr"),
                item.get("unit_type", "pcs"), item.get("unit_cost", 0),
                item.get("discount_percent", 0), item.get("quantity", 1),
                item.get("vat_percent", 24), item["line_total"], item.get("sort_order", i),
@@ -526,33 +488,7 @@ async def get_quotation_pdf_bytes(quotation_id: str) -> bytes:
    """Download the PDF for a quotation from Nextcloud and return raw bytes."""
    from fastapi import HTTPException
    quotation = await get_quotation(quotation_id)
    # For legacy quotations, the PDF is at legacy_pdf_path
    path = quotation.legacy_pdf_path if quotation.is_legacy else quotation.nextcloud_pdf_path
    if not path:
        raise HTTPException(status_code=404, detail="No PDF available for this quotation")
    pdf_bytes, _ = await nextcloud.download_file(path)
    if not quotation.nextcloud_pdf_path:
        raise HTTPException(status_code=404, detail="No PDF generated for this quotation")
    pdf_bytes, _ = await nextcloud.download_file(quotation.nextcloud_pdf_path)
    return pdf_bytes


async def upload_legacy_pdf(quotation_id: str, pdf_bytes: bytes, filename: str) -> QuotationInDB:
    """Upload a legacy PDF to Nextcloud and store its path in the quotation record."""
    quotation = await get_quotation(quotation_id)
    if not quotation.is_legacy:
        raise HTTPException(status_code=400, detail="This quotation is not a legacy quotation")

    from crm.service import get_customer, get_customer_nc_path
    customer = get_customer(quotation.customer_id)
    nc_folder = get_customer_nc_path(customer)

    await nextcloud.ensure_folder(f"customers/{nc_folder}/quotations")
    rel_path = f"customers/{nc_folder}/quotations/(unknown)"
    await nextcloud.upload_file(rel_path, pdf_bytes, "application/pdf")

    db = await mqtt_db.get_db()
    now = datetime.utcnow().isoformat()
    await db.execute(
        "UPDATE crm_quotations SET legacy_pdf_path = ?, updated_at = ? WHERE id = ?",
        (rel_path, now, quotation_id),
    )
    await db.commit()
    return await get_quotation(quotation_id)

@@ -1,4 +1,3 @@
-import asyncio
 import json
 import uuid
 from datetime import datetime
@@ -7,14 +6,13 @@ from fastapi import HTTPException
 from shared.firebase import get_db
 from shared.exceptions import NotFoundError
-import re as _re
-import database as mqtt_db
+from mqtt import database as mqtt_db
 from crm.models import (
     ProductCreate, ProductUpdate, ProductInDB,
     CustomerCreate, CustomerUpdate, CustomerInDB,
     OrderCreate, OrderUpdate, OrderInDB,
     CommCreate, CommUpdate, CommInDB,
     MediaCreate, MediaInDB,
-    TechnicalIssue, InstallSupportEntry, TransactionEntry,
 )

 COLLECTION = "crm_products"
@@ -22,11 +20,6 @@ COLLECTION = "crm_products"

 def _doc_to_product(doc) -> ProductInDB:
     data = doc.to_dict()
-    # Backfill bilingual fields for existing products that predate the feature
-    if not data.get("name_en") and data.get("name"):
-        data["name_en"] = data["name"]
-    if not data.get("name_gr") and data.get("name"):
-        data["name_gr"] = data["name"]
     return ProductInDB(id=doc.id, **data)

@@ -127,19 +120,14 @@ def delete_product(product_id: str) -> None:
 CUSTOMERS_COLLECTION = "crm_customers"


-_LEGACY_CUSTOMER_FIELDS = {"negotiating", "has_problem"}
-
 def _doc_to_customer(doc) -> CustomerInDB:
     data = doc.to_dict()
-    for f in _LEGACY_CUSTOMER_FIELDS:
-        data.pop(f, None)
     return CustomerInDB(id=doc.id, **data)


 def list_customers(
     search: str | None = None,
     tag: str | None = None,
-    sort: str | None = None,
 ) -> list[CustomerInDB]:
     db = get_db()
     query = db.collection(CUSTOMERS_COLLECTION)
@@ -153,64 +141,28 @@ def list_customers(

         if search:
             s = search.lower()
-            s_nospace = s.replace(" ", "")
             name_match = s in (customer.name or "").lower()
             surname_match = s in (customer.surname or "").lower()
             org_match = s in (customer.organization or "").lower()
-            religion_match = s in (customer.religion or "").lower()
-            language_match = s in (customer.language or "").lower()
             contact_match = any(
-                s_nospace in (c.value or "").lower().replace(" ", "")
-                or s in (c.value or "").lower()
+                s in (c.value or "").lower()
                 for c in (customer.contacts or [])
             )
-            loc = customer.location
-            loc_match = bool(loc) and (
-                s in (loc.address or "").lower() or
-                s in (loc.city or "").lower() or
-                s in (loc.postal_code or "").lower() or
-                s in (loc.region or "").lower() or
-                s in (loc.country or "").lower()
+            loc = customer.location or {}
+            loc_match = (
+                s in (loc.get("city", "") or "").lower() or
+                s in (loc.get("country", "") or "").lower() or
+                s in (loc.get("region", "") or "").lower()
             )
             tag_match = any(s in (t or "").lower() for t in (customer.tags or []))
-            if not (name_match or surname_match or org_match or religion_match or language_match or contact_match or loc_match or tag_match):
+            if not (name_match or surname_match or org_match or contact_match or loc_match or tag_match):
                 continue

         results.append(customer)

-    # Sorting (non-latest_comm; latest_comm is handled by the async router wrapper)
-    _TITLES = {"fr.", "rev.", "archim.", "bp.", "abp.", "met.", "mr.", "mrs.", "ms.", "dr.", "prof."}
-
-    def _sort_name(c):
-        return (c.name or "").lower()
-
-    def _sort_surname(c):
-        return (c.surname or "").lower()
-
-    def _sort_default(c):
-        return c.created_at or ""
-
-    if sort == "name":
-        results.sort(key=_sort_name)
-    elif sort == "surname":
-        results.sort(key=_sort_surname)
-    elif sort == "default":
-        results.sort(key=_sort_default)
-
     return results


-def list_all_tags() -> list[str]:
-    db = get_db()
-    tags: set[str] = set()
-    for doc in db.collection(CUSTOMERS_COLLECTION).select(["tags"]).stream():
-        data = doc.to_dict()
-        for tag in (data.get("tags") or []):
-            if tag:
-                tags.add(tag)
-    return sorted(tags)
-
-
 def get_customer(customer_id: str) -> CustomerInDB:
     db = get_db()
     doc = db.collection(CUSTOMERS_COLLECTION).document(customer_id).get()
@@ -254,7 +206,6 @@ def create_customer(data: CustomerCreate) -> CustomerInDB:


 def update_customer(customer_id: str, data: CustomerUpdate) -> CustomerInDB:
-    from google.cloud.firestore_v1 import DELETE_FIELD
     db = get_db()
     doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
     doc = doc_ref.get()
@@ -264,108 +215,35 @@ def update_customer(customer_id: str, data: CustomerUpdate) -> CustomerInDB:
     update_data = data.model_dump(exclude_none=True)
     update_data["updated_at"] = datetime.utcnow().isoformat()

-    # Fields that should be explicitly deleted from Firestore when set to None
-    # (exclude_none=True would just skip them, leaving the old value intact)
-    NULLABLE_FIELDS = {"title", "surname", "organization", "religion"}
-    set_fields = data.model_fields_set
-    for field in NULLABLE_FIELDS:
-        if field in set_fields and getattr(data, field) is None:
-            update_data[field] = DELETE_FIELD
-
     doc_ref.update(update_data)
     updated_doc = doc_ref.get()
     return _doc_to_customer(updated_doc)


-async def get_last_comm_direction(customer_id: str) -> dict:
-    """Return direction ('inbound'/'outbound') and timestamp of the most recent comm, or None."""
-    db = await mqtt_db.get_db()
-    rows = await db.execute_fetchall(
-        "SELECT direction, COALESCE(occurred_at, created_at) as ts FROM crm_comms_log WHERE customer_id = ? "
-        "AND direction IN ('inbound', 'outbound') "
-        "ORDER BY COALESCE(occurred_at, created_at) DESC, created_at DESC LIMIT 1",
-        (customer_id,),
-    )
-    if rows:
-        return {"direction": rows[0][0], "occurred_at": rows[0][1]}
-    return {"direction": None, "occurred_at": None}
-
-
-async def get_last_comm_timestamp(customer_id: str) -> str | None:
-    """Return the ISO timestamp of the most recent comm for this customer, or None."""
-    db = await mqtt_db.get_db()
-    rows = await db.execute_fetchall(
-        "SELECT COALESCE(occurred_at, created_at) as ts FROM crm_comms_log "
-        "WHERE customer_id = ? ORDER BY ts DESC LIMIT 1",
-        (customer_id,),
-    )
-    if rows:
-        return rows[0][0]
-    return None
-
-
-async def list_customers_sorted_by_latest_comm(customers: list[CustomerInDB]) -> list[CustomerInDB]:
-    """Re-sort a list of customers so those with the most recent comm come first."""
-    timestamps = await asyncio.gather(
-        *[get_last_comm_timestamp(c.id) for c in customers]
-    )
-    paired = list(zip(customers, timestamps))
-    paired.sort(key=lambda x: x[1] or "", reverse=True)
-    return [c for c, _ in paired]
-
-
-def delete_customer(customer_id: str) -> CustomerInDB:
-    """Delete customer from Firestore. Returns the customer data (for NC path lookup)."""
+def delete_customer(customer_id: str) -> None:
     db = get_db()
     doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
     doc = doc_ref.get()
     if not doc.exists:
         raise NotFoundError("Customer")
-    customer = _doc_to_customer(doc)
     doc_ref.delete()
-    return customer
-
-
-async def delete_customer_comms(customer_id: str) -> int:
-    """Delete all comm log entries for a customer. Returns count deleted."""
-    db = await mqtt_db.get_db()
-    cursor = await db.execute(
-        "DELETE FROM crm_comms_log WHERE customer_id = ?", (customer_id,)
-    )
-    await db.commit()
-    return cursor.rowcount
+# ── Orders ───────────────────────────────────────────────────────────────────
+
+ORDERS_COLLECTION = "crm_orders"
+
-async def delete_customer_media_entries(customer_id: str) -> int:
-    """Delete all media DB entries for a customer. Returns count deleted."""
-    db = await mqtt_db.get_db()
-    cursor = await db.execute(
-        "DELETE FROM crm_media WHERE customer_id = ?", (customer_id,)
-    )
-    await db.commit()
-    return cursor.rowcount
-
-
-# ── Orders (subcollection under customers/{id}/orders) ────────────────────────

 def _doc_to_order(doc) -> OrderInDB:
     data = doc.to_dict()
     return OrderInDB(id=doc.id, **data)


-def _order_collection(customer_id: str):
-    db = get_db()
-    return db.collection(CUSTOMERS_COLLECTION).document(customer_id).collection("orders")
-
-
-def _generate_order_number(customer_id: str) -> str:
-    """Generate next ORD-DDMMYY-NNN across all customers using collection group query."""
-    db = get_db()
-    now = datetime.utcnow()
-    prefix = f"ORD-{now.strftime('%d%m%y')}-"
+def _generate_order_number(db) -> str:
+    year = datetime.utcnow().year
+    prefix = f"ORD-{year}-"
     max_n = 0
-    for doc in db.collection_group("orders").stream():
+    for doc in db.collection(ORDERS_COLLECTION).stream():
         data = doc.to_dict()
         num = data.get("order_number", "")
         if num and num.startswith(prefix):
@@ -378,150 +256,50 @@ def _generate_order_number(customer_id: str) -> str:
     return f"{prefix}{max_n + 1:03d}"


-def _default_payment_status() -> dict:
-    return {
-        "required_amount": 0,
-        "received_amount": 0,
-        "balance_due": 0,
-        "advance_required": False,
-        "advance_amount": None,
-        "payment_complete": False,
-    }
-
-
-def _recalculate_order_payment_status(customer_id: str, order_id: str) -> None:
-    """Recompute an order's payment_status from transaction_history on the customer."""
+def list_orders(
+    customer_id: str | None = None,
+    status: str | None = None,
+    payment_status: str | None = None,
+) -> list[OrderInDB]:
     db = get_db()
-    cust_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
-    cust_data = (cust_ref.get().to_dict()) or {}
-    txns = cust_data.get("transaction_history") or []
-    required = sum(float(t.get("amount") or 0) for t in txns
-                   if t.get("order_ref") == order_id and t.get("flow") == "invoice")
-    received = sum(float(t.get("amount") or 0) for t in txns
-                   if t.get("order_ref") == order_id and t.get("flow") == "payment")
-    balance_due = required - received
-    payment_complete = (required > 0 and balance_due <= 0)
-    order_ref = _order_collection(customer_id).document(order_id)
-    if not order_ref.get().exists:
-        return
-    order_ref.update({
-        "payment_status": {
-            "required_amount": required,
-            "received_amount": received,
-            "balance_due": balance_due,
-            "advance_required": False,
-            "advance_amount": None,
-            "payment_complete": payment_complete,
-        },
-        "updated_at": datetime.utcnow().isoformat(),
-    })
+    query = db.collection(ORDERS_COLLECTION)
-
-
-def _update_crm_summary(customer_id: str) -> None:
-    """Recompute and store the crm_summary field on the customer document."""
-    db = get_db()
-    customer_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
-
-    # Load customer for issue/support arrays
-    customer_doc = customer_ref.get()
-    if not customer_doc.exists:
-        return
-    customer_data = customer_doc.to_dict() or {}
-
-    # Active issues
-    issues = customer_data.get("technical_issues") or []
-    active_issues = [i for i in issues if i.get("active")]
-    active_issues_count = len(active_issues)
-    latest_issue_date = None
-    if active_issues:
-        latest_issue_date = max((i.get("opened_date") or "") for i in active_issues) or None
-
-    # Active support
-    support = customer_data.get("install_support") or []
-    active_support = [s for s in support if s.get("active")]
-    active_support_count = len(active_support)
-    latest_support_date = None
-    if active_support:
-        latest_support_date = max((s.get("opened_date") or "") for s in active_support) or None
-
-    # Active order (most recent non-terminal status)
-    TERMINAL_STATUSES = {"declined", "complete"}
-    active_order_status = None
-    active_order_status_date = None
-    active_order_title = None
-    active_order_number = None
-    latest_order_date = ""
-    all_order_statuses = []
-    for doc in _order_collection(customer_id).stream():
-        data = doc.to_dict() or {}
-        status = data.get("status", "")
-        all_order_statuses.append(status)
-        if status not in TERMINAL_STATUSES:
-            upd = data.get("status_updated_date") or data.get("created_at") or ""
-            if upd > latest_order_date:
-                latest_order_date = upd
-                active_order_status = status
-                active_order_status_date = upd
-                active_order_title = data.get("title")
-                active_order_number = data.get("order_number")
-
-    summary = {
-        "active_order_status": active_order_status,
-        "active_order_status_date": active_order_status_date,
-        "active_order_title": active_order_title,
-        "active_order_number": active_order_number,
-        "all_orders_statuses": all_order_statuses,
-        "active_issues_count": active_issues_count,
-        "latest_issue_date": latest_issue_date,
-        "active_support_count": active_support_count,
-        "latest_support_date": latest_support_date,
-    }
-    customer_ref.update({"crm_summary": summary, "updated_at": datetime.utcnow().isoformat()})
-
-
-def list_orders(customer_id: str) -> list[OrderInDB]:
-    return [_doc_to_order(doc) for doc in _order_collection(customer_id).stream()]
-
-
-def list_all_orders(status: str | None = None) -> list[OrderInDB]:
-    """Query across all customers using Firestore collection group."""
-    db = get_db()
-    query = db.collection_group("orders")
+    if customer_id:
+        query = query.where("customer_id", "==", customer_id)
     if status:
         query = query.where("status", "==", status)
+    if payment_status:
+        query = query.where("payment_status", "==", payment_status)

     return [_doc_to_order(doc) for doc in query.stream()]


-def get_order(customer_id: str, order_id: str) -> OrderInDB:
-    doc = _order_collection(customer_id).document(order_id).get()
+def get_order(order_id: str) -> OrderInDB:
+    db = get_db()
+    doc = db.collection(ORDERS_COLLECTION).document(order_id).get()
     if not doc.exists:
         raise NotFoundError("Order")
     return _doc_to_order(doc)


-def create_order(customer_id: str, data: OrderCreate) -> OrderInDB:
-    col = _order_collection(customer_id)
+def create_order(data: OrderCreate) -> OrderInDB:
+    db = get_db()
     now = datetime.utcnow().isoformat()
     order_id = str(uuid.uuid4())

     doc_data = data.model_dump()
-    doc_data["customer_id"] = customer_id
-    if not doc_data.get("order_number"):
-        doc_data["order_number"] = _generate_order_number(customer_id)
-    if not doc_data.get("payment_status"):
-        doc_data["payment_status"] = _default_payment_status()
-    if not doc_data.get("status_updated_date"):
-        doc_data["status_updated_date"] = now
+    doc_data["order_number"] = _generate_order_number(db)
     doc_data["created_at"] = now
     doc_data["updated_at"] = now

-    col.document(order_id).set(doc_data)
-    _update_crm_summary(customer_id)
+    db.collection(ORDERS_COLLECTION).document(order_id).set(doc_data)
     return OrderInDB(id=order_id, **doc_data)


-def update_order(customer_id: str, order_id: str, data: OrderUpdate) -> OrderInDB:
-    doc_ref = _order_collection(customer_id).document(order_id)
+def update_order(order_id: str, data: OrderUpdate) -> OrderInDB:
+    db = get_db()
+    doc_ref = db.collection(ORDERS_COLLECTION).document(order_id)
     doc = doc_ref.get()
     if not doc.exists:
         raise NotFoundError("Order")
@@ -530,362 +308,17 @@ def update_order(customer_id: str, order_id: str, data: OrderUpdate) -> OrderInD
|
||||
update_data["updated_at"] = datetime.utcnow().isoformat()
|
||||
|
||||
doc_ref.update(update_data)
|
||||
_update_crm_summary(customer_id)
|
||||
result = _doc_to_order(doc_ref.get())
|
||||
updated_doc = doc_ref.get()
|
||||
return _doc_to_order(updated_doc)
|
||||
|
||||
# Auto-mark customer as inactive when all orders are complete
|
||||
if update_data.get("status") == "complete":
|
||||
all_orders = list_orders(customer_id)
|
||||
if all_orders and all(o.status == "complete" for o in all_orders):
|
||||
|
||||
def delete_order(order_id: str) -> None:
|
||||
db = get_db()
|
||||
db.collection(CUSTOMERS_COLLECTION).document(customer_id).update({
|
||||
"relationship_status": "inactive",
|
||||
"updated_at": datetime.utcnow().isoformat(),
|
||||
})
|
||||
|
||||
return result
|
||||
|
||||
|
||||
def delete_order(customer_id: str, order_id: str) -> None:
|
||||
doc_ref = _order_collection(customer_id).document(order_id)
|
||||
if not doc_ref.get().exists:
|
||||
doc_ref = db.collection(ORDERS_COLLECTION).document(order_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Order")
|
||||
doc_ref.delete()
|
||||
_update_crm_summary(customer_id)
|
||||
|
||||
|
||||
def append_timeline_event(customer_id: str, order_id: str, event: dict) -> OrderInDB:
|
||||
from google.cloud.firestore_v1 import ArrayUnion
|
||||
doc_ref = _order_collection(customer_id).document(order_id)
|
||||
if not doc_ref.get().exists:
|
||||
raise NotFoundError("Order")
|
||||
now = datetime.utcnow().isoformat()
|
||||
doc_ref.update({
|
||||
"timeline": ArrayUnion([event]),
|
||||
"status_updated_date": event.get("date", now),
|
||||
"status_updated_by": event.get("updated_by", ""),
|
||||
"updated_at": now,
|
||||
})
|
||||
return _doc_to_order(doc_ref.get())
|
||||
|
||||
|
||||
def update_timeline_event(customer_id: str, order_id: str, index: int, data: dict) -> OrderInDB:
|
||||
doc_ref = _order_collection(customer_id).document(order_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Order")
|
||||
timeline = list(doc.to_dict().get("timeline") or [])
|
||||
if index < 0 or index >= len(timeline):
|
||||
raise HTTPException(status_code=404, detail="Timeline index out of range")
|
||||
timeline[index] = {**timeline[index], **data}
|
||||
doc_ref.update({"timeline": timeline, "updated_at": datetime.utcnow().isoformat()})
|
||||
return _doc_to_order(doc_ref.get())
|
||||
|
||||
|
||||
def delete_timeline_event(customer_id: str, order_id: str, index: int) -> OrderInDB:
|
||||
doc_ref = _order_collection(customer_id).document(order_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Order")
|
||||
timeline = list(doc.to_dict().get("timeline") or [])
|
||||
if index < 0 or index >= len(timeline):
|
||||
raise HTTPException(status_code=404, detail="Timeline index out of range")
|
||||
timeline.pop(index)
|
||||
doc_ref.update({"timeline": timeline, "updated_at": datetime.utcnow().isoformat()})
|
||||
return _doc_to_order(doc_ref.get())
|
||||
|
||||
|
||||
def update_order_payment_status(customer_id: str, order_id: str, payment_data: dict) -> OrderInDB:
|
||||
doc_ref = _order_collection(customer_id).document(order_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Order")
|
||||
existing = doc.to_dict().get("payment_status") or _default_payment_status()
|
||||
existing.update({k: v for k, v in payment_data.items() if v is not None})
|
||||
doc_ref.update({
|
||||
"payment_status": existing,
|
||||
"updated_at": datetime.utcnow().isoformat(),
|
||||
})
|
||||
return _doc_to_order(doc_ref.get())
|
||||
|
||||
|
||||
def init_negotiations(customer_id: str, title: str, note: str, date: str, created_by: str) -> OrderInDB:
|
||||
"""Create a new order with status=negotiating and bump customer relationship_status if needed."""
|
||||
db = get_db()
|
||||
customer_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
customer_doc = customer_ref.get()
|
||||
if not customer_doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
|
||||
now = datetime.utcnow().isoformat()
|
||||
order_id = str(uuid.uuid4())
|
||||
|
||||
timeline_event = {
|
||||
"date": date or now,
|
||||
"type": "note",
|
||||
"note": note or "",
|
||||
"updated_by": created_by,
|
||||
}
|
||||
|
||||
doc_data = {
|
||||
"customer_id": customer_id,
|
||||
"order_number": _generate_order_number(customer_id),
|
||||
"title": title,
|
||||
"created_by": created_by,
|
||||
"status": "negotiating",
|
||||
"status_updated_date": date or now,
|
||||
"status_updated_by": created_by,
|
||||
"items": [],
|
||||
"subtotal": 0,
|
||||
"discount": None,
|
||||
"total_price": 0,
|
||||
"currency": "EUR",
|
||||
"shipping": None,
|
||||
"payment_status": _default_payment_status(),
|
||||
"invoice_path": None,
|
||||
"notes": note or "",
|
||||
"timeline": [timeline_event],
|
||||
"created_at": now,
|
||||
"updated_at": now,
|
||||
}
|
||||
|
||||
_order_collection(customer_id).document(order_id).set(doc_data)
|
||||
|
||||
# Upgrade relationship_status only if currently lead or prospect
|
||||
current_data = customer_doc.to_dict() or {}
|
||||
current_rel = current_data.get("relationship_status", "lead")
|
||||
if current_rel in ("lead", "prospect"):
|
||||
customer_ref.update({"relationship_status": "active", "updated_at": now})
|
||||
|
||||
_update_crm_summary(customer_id)
|
||||
return OrderInDB(id=order_id, **doc_data)
|
||||
|
||||
|
||||
# ── Technical Issues & Install Support ────────────────────────────────────────
|
||||
|
||||
def add_technical_issue(customer_id: str, note: str, opened_by: str, date: str | None = None) -> CustomerInDB:
|
||||
from google.cloud.firestore_v1 import ArrayUnion
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
if not doc_ref.get().exists:
|
||||
raise NotFoundError("Customer")
|
||||
now = datetime.utcnow().isoformat()
|
||||
issue = {
|
||||
"active": True,
|
||||
"opened_date": date or now,
|
||||
"resolved_date": None,
|
||||
"note": note,
|
||||
"opened_by": opened_by,
|
||||
"resolved_by": None,
|
||||
}
|
||||
doc_ref.update({"technical_issues": ArrayUnion([issue]), "updated_at": now})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def resolve_technical_issue(customer_id: str, index: int, resolved_by: str) -> CustomerInDB:
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
data = doc.to_dict() or {}
|
||||
issues = list(data.get("technical_issues") or [])
|
||||
if index < 0 or index >= len(issues):
|
||||
raise HTTPException(status_code=404, detail="Issue index out of range")
|
||||
now = datetime.utcnow().isoformat()
|
||||
issues[index] = {**issues[index], "active": False, "resolved_date": now, "resolved_by": resolved_by}
|
||||
doc_ref.update({"technical_issues": issues, "updated_at": now})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def add_install_support(customer_id: str, note: str, opened_by: str, date: str | None = None) -> CustomerInDB:
|
||||
from google.cloud.firestore_v1 import ArrayUnion
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
if not doc_ref.get().exists:
|
||||
raise NotFoundError("Customer")
|
||||
now = datetime.utcnow().isoformat()
|
||||
entry = {
|
||||
"active": True,
|
||||
"opened_date": date or now,
|
||||
"resolved_date": None,
|
||||
"note": note,
|
||||
"opened_by": opened_by,
|
||||
"resolved_by": None,
|
||||
}
|
||||
doc_ref.update({"install_support": ArrayUnion([entry]), "updated_at": now})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def resolve_install_support(customer_id: str, index: int, resolved_by: str) -> CustomerInDB:
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
data = doc.to_dict() or {}
|
||||
entries = list(data.get("install_support") or [])
|
||||
if index < 0 or index >= len(entries):
|
||||
raise HTTPException(status_code=404, detail="Support index out of range")
|
||||
now = datetime.utcnow().isoformat()
|
||||
entries[index] = {**entries[index], "active": False, "resolved_date": now, "resolved_by": resolved_by}
|
||||
doc_ref.update({"install_support": entries, "updated_at": now})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def edit_technical_issue(customer_id: str, index: int, note: str, opened_date: str | None = None) -> CustomerInDB:
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
data = doc.to_dict() or {}
|
||||
issues = list(data.get("technical_issues") or [])
|
||||
if index < 0 or index >= len(issues):
|
||||
raise HTTPException(status_code=404, detail="Issue index out of range")
|
||||
issues[index] = {**issues[index], "note": note}
|
||||
if opened_date:
|
||||
issues[index]["opened_date"] = opened_date
|
||||
doc_ref.update({"technical_issues": issues, "updated_at": datetime.utcnow().isoformat()})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def delete_technical_issue(customer_id: str, index: int) -> CustomerInDB:
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
data = doc.to_dict() or {}
|
||||
issues = list(data.get("technical_issues") or [])
|
||||
if index < 0 or index >= len(issues):
|
||||
raise HTTPException(status_code=404, detail="Issue index out of range")
|
||||
issues.pop(index)
|
||||
doc_ref.update({"technical_issues": issues, "updated_at": datetime.utcnow().isoformat()})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def edit_install_support(customer_id: str, index: int, note: str, opened_date: str | None = None) -> CustomerInDB:
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
data = doc.to_dict() or {}
|
||||
entries = list(data.get("install_support") or [])
|
||||
if index < 0 or index >= len(entries):
|
||||
raise HTTPException(status_code=404, detail="Support index out of range")
|
||||
entries[index] = {**entries[index], "note": note}
|
||||
if opened_date:
|
||||
entries[index]["opened_date"] = opened_date
|
||||
doc_ref.update({"install_support": entries, "updated_at": datetime.utcnow().isoformat()})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
def delete_install_support(customer_id: str, index: int) -> CustomerInDB:
|
||||
db = get_db()
|
||||
doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
|
||||
doc = doc_ref.get()
|
||||
if not doc.exists:
|
||||
raise NotFoundError("Customer")
|
||||
data = doc.to_dict() or {}
|
||||
entries = list(data.get("install_support") or [])
|
||||
if index < 0 or index >= len(entries):
|
||||
raise HTTPException(status_code=404, detail="Support index out of range")
|
||||
entries.pop(index)
|
||||
doc_ref.update({"install_support": entries, "updated_at": datetime.utcnow().isoformat()})
|
||||
_update_crm_summary(customer_id)
|
||||
return _doc_to_customer(doc_ref.get())
|
||||
|
||||
|
||||
# ── Transactions ──────────────────────────────────────────────────────────────
|
||||
|
def add_transaction(customer_id: str, entry: TransactionEntry) -> CustomerInDB:
    from google.cloud.firestore_v1 import ArrayUnion
    db = get_db()
    doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
    if not doc_ref.get().exists:
        raise NotFoundError("Customer")
    now = datetime.utcnow().isoformat()
    doc_ref.update({"transaction_history": ArrayUnion([entry.model_dump()]), "updated_at": now})
    if entry.order_ref:
        _recalculate_order_payment_status(customer_id, entry.order_ref)
    return _doc_to_customer(doc_ref.get())


def update_transaction(customer_id: str, index: int, entry: TransactionEntry) -> CustomerInDB:
    db = get_db()
    doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
    doc = doc_ref.get()
    if not doc.exists:
        raise NotFoundError("Customer")
    data = doc.to_dict() or {}
    txns = list(data.get("transaction_history") or [])
    if index < 0 or index >= len(txns):
        raise HTTPException(status_code=404, detail="Transaction index out of range")
    txns[index] = entry.model_dump()
    now = datetime.utcnow().isoformat()
    doc_ref.update({"transaction_history": txns, "updated_at": now})
    if entry.order_ref:
        _recalculate_order_payment_status(customer_id, entry.order_ref)
    return _doc_to_customer(doc_ref.get())


def delete_transaction(customer_id: str, index: int) -> CustomerInDB:
    db = get_db()
    doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
    doc = doc_ref.get()
    if not doc.exists:
        raise NotFoundError("Customer")
    data = doc.to_dict() or {}
    txns = list(data.get("transaction_history") or [])
    if index < 0 or index >= len(txns):
        raise HTTPException(status_code=404, detail="Transaction index out of range")
    deleted_order_ref = txns[index].get("order_ref")
    txns.pop(index)
    now = datetime.utcnow().isoformat()
    doc_ref.update({"transaction_history": txns, "updated_at": now})
    if deleted_order_ref:
        _recalculate_order_payment_status(customer_id, deleted_order_ref)
    return _doc_to_customer(doc_ref.get())
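Each mutation above re-runs `_recalculate_order_payment_status`, a helper not shown in this diff. A minimal, hypothetical sketch of the classification it presumably performs (the real helper reads the order and its transactions from Firestore; names and thresholds here are assumptions):

```python
# Hypothetical sketch — pure-Python stand-in for the Firestore-backed helper.
def derive_payment_status(order_total: float, payments: list) -> str:
    """Classify an order by how much of its total has been paid."""
    paid = sum(payments)
    if paid <= 0:
        return "unpaid"
    if paid < order_total:
        return "partially_paid"
    return "paid"
```

Running it after every add/update/delete keeps the derived status consistent with the transaction history without storing it redundantly per transaction.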
# ── Relationship Status ───────────────────────────────────────────────────────


def update_relationship_status(customer_id: str, status: str) -> CustomerInDB:
    VALID = {"lead", "prospect", "active", "inactive", "churned"}
    if status not in VALID:
        raise HTTPException(status_code=422, detail=f"Invalid relationship_status: {status}")
    db = get_db()
    doc_ref = db.collection(CUSTOMERS_COLLECTION).document(customer_id)
    if not doc_ref.get().exists:
        raise NotFoundError("Customer")

    # Failsafe: cannot manually mark inactive if open (non-terminal) orders exist
    if status == "inactive":
        TERMINAL = {"declined", "complete"}
        open_orders = [
            doc for doc in _order_collection(customer_id).stream()
            if (doc.to_dict() or {}).get("status", "") not in TERMINAL
        ]
        if open_orders:
            raise HTTPException(
                status_code=409,
                detail=(
                    f"Cannot mark as inactive: {len(open_orders)} open order(s) still exist. "
                    "Please resolve all orders before changing the status."
                ),
            )

    doc_ref.update({"relationship_status": status, "updated_at": datetime.utcnow().isoformat()})
    return _doc_to_customer(doc_ref.get())
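The 409 failsafe boils down to a small predicate over order statuses. A self-contained sketch of that check (the `TERMINAL` set matches the function above; the helper name is illustrative, not part of the service module):

```python
TERMINAL_ORDER_STATUSES = {"declined", "complete"}

def open_order_count(order_statuses) -> int:
    """Count orders that are still open, i.e. not in a terminal state."""
    return sum(1 for s in order_statuses if s not in TERMINAL_ORDER_STATUSES)
```

If the count is non-zero, the endpoint refuses the manual `inactive` transition; the daily poller applies the inverse rule automatically.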
# ── Comms Log (SQLite, async) ─────────────────────────────────────────────────

@@ -1161,11 +594,11 @@ async def create_media(data: MediaCreate) -> MediaInDB:
    await db.execute(
        """INSERT INTO crm_media
           (id, customer_id, order_id, filename, nextcloud_path, mime_type,
            direction, tags, uploaded_by, thumbnail_path, created_at)
           VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
            direction, tags, uploaded_by, created_at)
           VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
        (media_id, data.customer_id, data.order_id, data.filename,
         data.nextcloud_path, data.mime_type, direction,
         tags_json, data.uploaded_by, data.thumbnail_path, now),
         tags_json, data.uploaded_by, now),
    )
    await db.commit()

@@ -1184,65 +617,3 @@ async def delete_media(media_id: str) -> None:
        raise HTTPException(status_code=404, detail="Media entry not found")
    await db.execute("DELETE FROM crm_media WHERE id = ?", (media_id,))
    await db.commit()
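The hunk above changes the column list and the placeholder count together — they must stay in lockstep. A minimal synchronous illustration using stdlib `sqlite3` (the real code uses an async driver; table shape here is trimmed for the example):

```python
import sqlite3

# In-memory table with a reduced column set, just to show the pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crm_media (id TEXT, filename TEXT, created_at TEXT)")

# One parameter per "?": dropping a column means dropping its placeholder too.
row = ("m1", "photo.jpg", "2024-01-01T00:00:00")
conn.execute("INSERT INTO crm_media (id, filename, created_at) VALUES (?, ?, ?)", row)
fetched = conn.execute("SELECT id, filename FROM crm_media").fetchone()
```

A mismatched placeholder count raises `sqlite3.ProgrammingError` at execute time, which is exactly the failure mode the paired edits above avoid.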
# ── Background polling ────────────────────────────────────────────────────────

PRE_MFG_STATUSES = {"negotiating", "awaiting_quotation", "awaiting_customer_confirmation", "awaiting_fulfilment", "awaiting_payment"}
TERMINAL_STATUSES = {"declined", "complete"}


def poll_crm_customer_statuses() -> None:
    """
    Two checks run daily:

    1. Active + open pre-mfg order + 12+ months since last comm → churn.
    2. Inactive + has any open (non-terminal) order → flip back to active.
    """
    db = get_db()
    now = datetime.utcnow()

    for doc in db.collection(CUSTOMERS_COLLECTION).stream():
        try:
            data = doc.to_dict() or {}
            rel_status = data.get("relationship_status", "lead")
            summary = data.get("crm_summary") or {}
            all_statuses = summary.get("all_orders_statuses") or []

            # ── Check 1: active + silent 12 months on a pre-mfg order → churned ──
            if rel_status == "active":
                has_open_pre_mfg = any(s in PRE_MFG_STATUSES for s in all_statuses)
                if not has_open_pre_mfg:
                    continue

                # crm_summary does not carry a last-contact date, so read the
                # customer's Firestore comms subcollection directly.
                comms = list(db.collection(CUSTOMERS_COLLECTION).document(doc.id).collection("comms").stream())
                if not comms:
                    continue
                latest_date_str = max((c.to_dict().get("date") or "") for c in comms)
                if not latest_date_str:
                    continue
                last_contact = datetime.fromisoformat(latest_date_str.rstrip("Z").split("+")[0])
                days_since = (now - last_contact).days
                if days_since >= 365:
                    db.collection(CUSTOMERS_COLLECTION).document(doc.id).update({
                        "relationship_status": "churned",
                        "updated_at": now.isoformat(),
                    })
                    print(f"[CRM POLL] {doc.id} → churned ({days_since}d silent, open pre-mfg order)")

            # ── Check 2: inactive + open orders exist → flip back to active ──
            elif rel_status == "inactive":
                has_open = any(s not in TERMINAL_STATUSES for s in all_statuses)
                if has_open:
                    db.collection(CUSTOMERS_COLLECTION).document(doc.id).update({
                        "relationship_status": "active",
                        "updated_at": now.isoformat(),
                    })
                    print(f"[CRM POLL] {doc.id} → active (inactive but has open orders)")

        except Exception as e:
            print(f"[CRM POLL] Error processing customer {doc.id}: {e}")
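The poller strips timezone suffixes with `rstrip("Z").split("+")[0]`, which silently discards non-UTC offsets. A sketch of a stricter alternative that normalizes everything to naive UTC before subtracting (standalone helper, not part of the diff):

```python
from datetime import datetime, timezone

def parse_comm_date(s: str) -> datetime:
    """Parse an ISO-8601 string, tolerating a trailing 'Z', into naive UTC."""
    # fromisoformat() before Python 3.11 rejects a bare 'Z', so map it first.
    dt = datetime.fromisoformat(s.replace("Z", "+00:00"))
    if dt.tzinfo is not None:
        # Shift offset-aware values into UTC, then drop the tzinfo so the
        # result is directly comparable with datetime.utcnow().
        dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
    return dt
```

With this, a `+02:00` timestamp shifts by two hours instead of being truncated, so `days_since` stays accurate near the 365-day boundary.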
@@ -1,125 +0,0 @@
"""
Thumbnail generation for uploaded media files.

Supports:
- Images (via Pillow): JPEG thumbnail at 220×220 max
- Videos (via ffmpeg subprocess): extract first frame as JPEG
- PDFs (via pdf2image + Poppler): render first page as JPEG

Returns None if the type is unsupported or if generation fails.
"""
import io
import logging
import subprocess
from pathlib import Path

logger = logging.getLogger(__name__)

THUMB_SIZE = (220, 220)  # small enough for gallery tiles; keeps files ~4-6 KB


def _thumb_from_image(content: bytes) -> bytes | None:
    try:
        from PIL import Image, ImageOps
        img = Image.open(io.BytesIO(content))
        img = ImageOps.exif_transpose(img)  # honour EXIF Orientation tag before resizing
        img = img.convert("RGB")
        img.thumbnail(THUMB_SIZE, Image.LANCZOS)
        out = io.BytesIO()
        # quality=65 + optimize=True + progressive encoding → ~4-6 KB for typical photos
        img.save(out, format="JPEG", quality=65, optimize=True, progressive=True)
        return out.getvalue()
    except Exception as e:
        logger.warning("Image thumbnail failed: %s", e)
        return None


def _thumb_from_video(content: bytes) -> bytes | None:
    """
    Extract the first frame of a video as a JPEG thumbnail.

    We write the video to a temp file instead of piping it to ffmpeg because
    most video containers (MP4, MOV, MKV …) store their index (moov atom) at
    an arbitrary offset and ffmpeg cannot seek on a pipe — causing rc≠0 with
    "moov atom not found" or similar errors when stdin is used.
    """
    import tempfile
    import os
    try:
        # Write to a temp file so ffmpeg can seek freely
        with tempfile.NamedTemporaryFile(suffix=".video", delete=False) as tmp_in:
            tmp_in.write(content)
            tmp_in_path = tmp_in.name

        with tempfile.NamedTemporaryFile(suffix=".jpg", delete=False) as tmp_out:
            tmp_out_path = tmp_out.name

        try:
            result = subprocess.run(
                [
                    "ffmpeg", "-y",
                    "-i", tmp_in_path,
                    "-vframes", "1",
                    "-vf", f"scale={THUMB_SIZE[0]}:-2",
                    "-q:v", "4",  # JPEG quality 1-31 (lower = better); 4 ≈ ~80% quality
                    tmp_out_path,
                ],
                capture_output=True,
                timeout=60,
            )
            if result.returncode == 0 and os.path.getsize(tmp_out_path) > 0:
                with open(tmp_out_path, "rb") as f:
                    return f.read()
            logger.warning(
                "ffmpeg video thumb failed (rc=%s): %s",
                result.returncode,
                result.stderr[-400:].decode(errors="replace") if result.stderr else "",
            )
            return None
        finally:
            os.unlink(tmp_in_path)
            try:
                os.unlink(tmp_out_path)
            except FileNotFoundError:
                pass
    except FileNotFoundError:
        logger.warning("ffmpeg not found — video thumbnails unavailable")
        return None
    except Exception as e:
        logger.warning("Video thumbnail failed: %s", e)
        return None


def _thumb_from_pdf(content: bytes) -> bytes | None:
    try:
        from pdf2image import convert_from_bytes
        pages = convert_from_bytes(content, first_page=1, last_page=1, size=THUMB_SIZE)
        if not pages:
            return None
        out = io.BytesIO()
        pages[0].save(out, format="JPEG", quality=55, optimize=True, progressive=True)
        return out.getvalue()
    except ImportError:
        logger.warning("pdf2image not installed — PDF thumbnails unavailable")
        return None
    except Exception as e:
        logger.warning("PDF thumbnail failed: %s", e)
        return None


def generate_thumbnail(content: bytes, mime_type: str, filename: str) -> bytes | None:
    """
    Generate a small JPEG thumbnail for the given file content.
    Returns JPEG bytes or None if unsupported / generation fails.
    """
    mt = (mime_type or "").lower()
    fn = (filename or "").lower()

    if mt.startswith("image/"):
        return _thumb_from_image(content)
    if mt.startswith("video/"):
        return _thumb_from_video(content)
    if mt == "application/pdf" or fn.endswith(".pdf"):
        return _thumb_from_pdf(content)

    return None
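All three generators promise JPEG bytes or `None`, so a caller can sanity-check the result before persisting it. A small stdlib-only validator (illustrative helper, not part of the deleted module), based on the JPEG magic bytes:

```python
def looks_like_jpeg(data: bytes) -> bool:
    """Cheap structural check: JPEG streams start FF D8 FF and end FF D9."""
    return (
        len(data) >= 4
        and data[:3] == b"\xff\xd8\xff"
        and data[-2:] == b"\xff\xd9"
    )
```

This catches the most common failure mode (an empty or truncated output file from a crashed subprocess) without pulling in an image library.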
@@ -1,39 +0,0 @@
from database.core import (
    init_db,
    close_db,
    get_db,
    purge_loop,
    purge_old_data,
    insert_log,
    insert_heartbeat,
    insert_command,
    update_command_response,
    get_logs,
    get_heartbeats,
    get_commands,
    get_latest_heartbeats,
    get_pending_command,
    upsert_alert,
    delete_alert,
    get_alerts,
)

__all__ = [
    "init_db",
    "close_db",
    "get_db",
    "purge_loop",
    "purge_old_data",
    "insert_log",
    "insert_heartbeat",
    "insert_command",
    "update_command_response",
    "get_logs",
    "get_heartbeats",
    "get_commands",
    "get_latest_heartbeats",
    "get_pending_command",
    "upsert_alert",
    "delete_alert",
    "get_alerts",
]
@@ -31,11 +31,11 @@ class DeviceTiers(str, Enum):
class DeviceNetworkSettings(BaseModel):
    hostname: str = ""
    useStaticIP: bool = False
    ipAddress: Any = []
    gateway: Any = []
    subnet: Any = []
    dns1: Any = []
    dns2: Any = []
    ipAddress: List[str] = []
    gateway: List[str] = []
    subnet: List[str] = []
    dns1: List[str] = []
    dns2: List[str] = []


class DeviceClockSettings(BaseModel):
@@ -119,19 +119,13 @@ class DeviceCreate(BaseModel):
    device_subscription: DeviceSubInformation = DeviceSubInformation()
    device_stats: DeviceStatistics = DeviceStatistics()
    events_on: bool = False
    device_location_coordinates: Any = None  # GeoPoint dict {lat, lng} or legacy str
    device_location_coordinates: str = ""
    device_melodies_all: List[MelodyMainItem] = []
    device_melodies_favorites: List[str] = []
    user_list: List[str] = []
    websocket_url: str = ""
    churchAssistantURL: str = ""
    staffNotes: str = ""
    hw_family: str = ""
    hw_revision: str = ""
    tags: List[str] = []
    serial_number: str = ""
    customer_id: str = ""
    mfg_status: str = ""


class DeviceUpdate(BaseModel):
@@ -144,23 +138,17 @@ class DeviceUpdate(BaseModel):
    device_subscription: Optional[Dict[str, Any]] = None
    device_stats: Optional[Dict[str, Any]] = None
    events_on: Optional[bool] = None
    device_location_coordinates: Optional[Any] = None  # dict {lat, lng} or legacy str
    device_location_coordinates: Optional[str] = None
    device_melodies_all: Optional[List[MelodyMainItem]] = None
    device_melodies_favorites: Optional[List[str]] = None
    user_list: Optional[List[str]] = None
    websocket_url: Optional[str] = None
    churchAssistantURL: Optional[str] = None
    staffNotes: Optional[str] = None
    hw_family: Optional[str] = None
    hw_revision: Optional[str] = None
    tags: Optional[List[str]] = None
    customer_id: Optional[str] = None
    mfg_status: Optional[str] = None


class DeviceInDB(DeviceCreate):
    id: str
    # Legacy field — kept for backwards compat; new docs use serial_number
    device_id: str = ""


@@ -169,15 +157,6 @@ class DeviceListResponse(BaseModel):
    total: int


class DeviceNoteCreate(BaseModel):
    content: str
    created_by: str = ""


class DeviceNoteUpdate(BaseModel):
    content: str


class DeviceUserInfo(BaseModel):
    """User info resolved from device_users sub-collection or user_list."""
    user_id: str = ""
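The `device_location_coordinates` hunk swaps between a `{lat, lng}` dict and a legacy string representation. A framework-free sketch of how a reader of either shape might normalize it (helper name and the `"lat,lng"` string layout are assumptions; the real models accept both via `Any`):

```python
def parse_legacy_coords(value):
    """Accept either a {lat, lng} dict or a legacy 'lat,lng' string; return a dict or None."""
    if isinstance(value, dict) and "lat" in value and "lng" in value:
        return {"lat": float(value["lat"]), "lng": float(value["lng"])}
    if isinstance(value, str) and "," in value:
        lat, lng = value.split(",", 1)
        return {"lat": float(lat), "lng": float(lng)}
    return None  # empty string, None, or an unrecognized shape
```

Normalizing at the read boundary lets the rest of the code assume one canonical shape regardless of which schema wrote the document.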
@@ -1,25 +1,17 @@
import uuid
from datetime import datetime
from fastapi import APIRouter, Depends, Query, HTTPException
from typing import Optional, List
from pydantic import BaseModel
from fastapi import APIRouter, Depends, Query
from typing import Optional
from auth.models import TokenPayload
from auth.dependencies import require_permission
from devices.models import (
    DeviceCreate, DeviceUpdate, DeviceInDB, DeviceListResponse,
    DeviceUsersResponse, DeviceUserInfo,
    DeviceNoteCreate, DeviceNoteUpdate,
)
from devices import service
import database as mqtt_db
from mqtt import database as mqtt_db
from mqtt.models import DeviceAlertEntry, DeviceAlertsResponse
from shared.firebase import get_db as get_firestore

router = APIRouter(prefix="/api/devices", tags=["devices"])

NOTES_COLLECTION = "notes"
CRM_COLLECTION = "crm_customers"


@router.get("", response_model=DeviceListResponse)
async def list_devices(
@@ -87,375 +79,3 @@ async def get_device_alerts(
    """Return the current active alert set for a device. Empty list means fully healthy."""
    rows = await mqtt_db.get_alerts(device_id)
    return DeviceAlertsResponse(alerts=[DeviceAlertEntry(**r) for r in rows])


# ─────────────────────────────────────────────────────────────────────────────
# Device Notes
# ─────────────────────────────────────────────────────────────────────────────

@router.get("/{device_id}/notes")
async def list_device_notes(
    device_id: str,
    _user: TokenPayload = Depends(require_permission("devices", "view")),
):
    """List all notes for a device."""
    db = get_firestore()
    docs = db.collection(NOTES_COLLECTION).where("device_id", "==", device_id).order_by("created_at").stream()
    notes = []
    for doc in docs:
        note = doc.to_dict()
        note["id"] = doc.id
        # Convert Firestore Timestamps to ISO strings
        for f in ("created_at", "updated_at"):
            if hasattr(note.get(f), "isoformat"):
                note[f] = note[f].isoformat()
        notes.append(note)
    return {"notes": notes, "total": len(notes)}


@router.post("/{device_id}/notes", status_code=201)
async def create_device_note(
    device_id: str,
    body: DeviceNoteCreate,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Create a new note for a device."""
    db = get_firestore()
    now = datetime.utcnow()
    note_id = str(uuid.uuid4())
    note_data = {
        "device_id": device_id,
        "content": body.content,
        "created_by": body.created_by or _user.name or "",
        "created_at": now,
        "updated_at": now,
    }
    db.collection(NOTES_COLLECTION).document(note_id).set(note_data)
    note_data["id"] = note_id
    note_data["created_at"] = now.isoformat()
    note_data["updated_at"] = now.isoformat()
    return note_data


@router.put("/{device_id}/notes/{note_id}")
async def update_device_note(
    device_id: str,
    note_id: str,
    body: DeviceNoteUpdate,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Update an existing device note."""
    db = get_firestore()
    doc_ref = db.collection(NOTES_COLLECTION).document(note_id)
    doc = doc_ref.get()
    if not doc.exists or doc.to_dict().get("device_id") != device_id:
        raise HTTPException(status_code=404, detail="Note not found")
    now = datetime.utcnow()
    doc_ref.update({"content": body.content, "updated_at": now})
    updated = doc.to_dict()
    updated["id"] = note_id
    updated["content"] = body.content
    updated["updated_at"] = now.isoformat()
    if hasattr(updated.get("created_at"), "isoformat"):
        updated["created_at"] = updated["created_at"].isoformat()
    return updated


@router.delete("/{device_id}/notes/{note_id}", status_code=204)
async def delete_device_note(
    device_id: str,
    note_id: str,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Delete a device note."""
    db = get_firestore()
    doc_ref = db.collection(NOTES_COLLECTION).document(note_id)
    doc = doc_ref.get()
    if not doc.exists or doc.to_dict().get("device_id") != device_id:
        raise HTTPException(status_code=404, detail="Note not found")
    doc_ref.delete()
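The notes endpoints repeat the same `hasattr(..., "isoformat")` dance to turn Firestore Timestamps into JSON-safe strings. That pattern factors into a small pure function (illustrative helper, not present in the router):

```python
from datetime import datetime

def isoformat_fields(record: dict, fields=("created_at", "updated_at")) -> dict:
    """Return a copy of record with datetime-like fields rendered as ISO strings."""
    out = dict(record)
    for f in fields:
        # Firestore's DatetimeWithNanoseconds is a datetime subclass,
        # so duck-typing on isoformat() covers both it and plain datetime.
        if hasattr(out.get(f), "isoformat"):
            out[f] = out[f].isoformat()
    return out
```

Centralizing the conversion keeps list, create, and update responses serializing timestamps identically.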
# ─────────────────────────────────────────────────────────────────────────────
# Device Tags
# ─────────────────────────────────────────────────────────────────────────────

class TagsUpdate(BaseModel):
    tags: List[str]


@router.put("/{device_id}/tags", response_model=DeviceInDB)
async def update_device_tags(
    device_id: str,
    body: TagsUpdate,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Replace the tags list for a device."""
    return service.update_device(device_id, DeviceUpdate(tags=body.tags))
# ─────────────────────────────────────────────────────────────────────────────
# Assign Device to Customer
# ─────────────────────────────────────────────────────────────────────────────

class CustomerSearchResult(BaseModel):
    id: str
    name: str
    email: str
    organization: str = ""


class AssignCustomerBody(BaseModel):
    customer_id: str
    label: str = ""


@router.get("/{device_id}/customer-search")
async def search_customers_for_device(
    device_id: str,
    q: str = Query(""),
    _user: TokenPayload = Depends(require_permission("devices", "view")),
):
    """Search customers by name, email, phone, org, or tags, returning top 20 matches."""
    db = get_firestore()
    docs = db.collection(CRM_COLLECTION).stream()
    results = []
    q_lower = q.lower().strip()
    for doc in docs:
        data = doc.to_dict()
        name = data.get("name", "") or ""
        surname = data.get("surname", "") or ""
        email = data.get("email", "") or ""
        organization = data.get("organization", "") or ""
        phone = data.get("phone", "") or ""
        tags = " ".join(data.get("tags", []) or [])
        location = data.get("location") or {}
        city = location.get("city", "") or ""
        searchable = f"{name} {surname} {email} {organization} {phone} {tags} {city}".lower()
        if not q_lower or q_lower in searchable:
            results.append({
                "id": doc.id,
                "name": name,
                "surname": surname,
                "email": email,
                "organization": organization,
                "city": city,
            })
            if len(results) >= 20:
                break
    return {"results": results}


@router.post("/{device_id}/assign-customer")
async def assign_device_to_customer(
    device_id: str,
    body: AssignCustomerBody,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Assign a device to a customer.

    - Sets owner field on the device document.
    - Adds a console_device entry to the customer's owned_items list.
    """
    db = get_firestore()

    # Verify device exists
    device = service.get_device(device_id)

    # Get customer
    customer_ref = db.collection(CRM_COLLECTION).document(body.customer_id)
    customer_doc = customer_ref.get()
    if not customer_doc.exists:
        raise HTTPException(status_code=404, detail="Customer not found")
    customer_data = customer_doc.to_dict()
    customer_email = customer_data.get("email", "")

    # Update device: owner email + customer_id
    device_ref = db.collection("devices").document(device_id)
    device_ref.update({"owner": customer_email, "customer_id": body.customer_id})

    # Add to customer owned_items (avoid duplicates)
    owned_items = customer_data.get("owned_items", []) or []
    already_assigned = any(
        item.get("type") == "console_device" and item.get("console_device", {}).get("device_id") == device_id
        for item in owned_items
    )
    if not already_assigned:
        owned_items.append({
            "type": "console_device",
            "console_device": {
                "device_id": device_id,
                "label": body.label or device.device_name or device_id,
            }
        })
        customer_ref.update({"owned_items": owned_items})

    return {"status": "assigned", "device_id": device_id, "customer_id": body.customer_id}


@router.delete("/{device_id}/assign-customer", status_code=204)
async def unassign_device_from_customer(
    device_id: str,
    customer_id: str = Query(...),
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Remove device assignment from a customer."""
    db = get_firestore()

    # Clear customer_id on device
    device_ref = db.collection("devices").document(device_id)
    device_ref.update({"customer_id": ""})

    # Remove from customer owned_items
    customer_ref = db.collection(CRM_COLLECTION).document(customer_id)
    customer_doc = customer_ref.get()
    if customer_doc.exists:
        customer_data = customer_doc.to_dict()
        owned_items = [
            item for item in (customer_data.get("owned_items") or [])
            if not (item.get("type") == "console_device" and item.get("console_device", {}).get("device_id") == device_id)
        ]
        customer_ref.update({"owned_items": owned_items})
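The customer search builds one lowercase "searchable" string per document and does a substring test against it. The core of that matching, extracted as a standalone sketch (field names mirror the endpoint above; the helper itself is illustrative):

```python
def customer_matches(customer: dict, q: str) -> bool:
    """Substring match across the same fields the /customer-search endpoint scans."""
    q = q.lower().strip()
    if not q:
        return True  # empty query matches everything, capped by the caller
    parts = [customer.get(k, "") or "" for k in ("name", "surname", "email", "organization", "phone")]
    parts.append(" ".join(customer.get("tags") or []))
    parts.append((customer.get("location") or {}).get("city", "") or "")
    return q in " ".join(parts).lower()
```

Note this is a full-collection scan with client-side filtering; it is simple and tolerant of partial data, at the cost of reading every CRM document per query.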
# ─────────────────────────────────────────────────────────────────────────────
# Customer detail (for Owner display in fleet)
# ─────────────────────────────────────────────────────────────────────────────

@router.get("/{device_id}/customer")
async def get_device_customer(
    device_id: str,
    _user: TokenPayload = Depends(require_permission("devices", "view")),
):
    """Return basic customer details for a device's assigned customer_id."""
    db = get_firestore()
    device_ref = db.collection("devices").document(device_id)
    device_doc = device_ref.get()
    if not device_doc.exists:
        raise HTTPException(status_code=404, detail="Device not found")
    device_data = device_doc.to_dict() or {}
    customer_id = device_data.get("customer_id")
    if not customer_id:
        return {"customer": None}
    customer_doc = db.collection(CRM_COLLECTION).document(customer_id).get()
    if not customer_doc.exists:
        return {"customer": None}
    cd = customer_doc.to_dict() or {}
    return {
        "customer": {
            "id": customer_doc.id,
            "name": cd.get("name") or "",
            "email": cd.get("email") or "",
            "organization": cd.get("organization") or "",
            "phone": cd.get("phone") or "",
        }
    }
# ─────────────────────────────────────────────────────────────────────────────
# User list management (for Manage tab — assign/remove users from user_list)
# ─────────────────────────────────────────────────────────────────────────────

class UserSearchResult(BaseModel):
    id: str
    display_name: str = ""
    email: str = ""
    phone: str = ""
    photo_url: str = ""


@router.get("/{device_id}/user-search")
async def search_users_for_device(
    device_id: str,
    q: str = Query(""),
    _user: TokenPayload = Depends(require_permission("devices", "view")),
):
    """Search the users collection by name, email, or phone."""
    db = get_firestore()
    docs = db.collection("users").stream()
    results = []
    q_lower = q.lower().strip()
    for doc in docs:
        data = doc.to_dict() or {}
        name = (data.get("display_name") or "").lower()
        email = (data.get("email") or "").lower()
        phone = (data.get("phone") or "").lower()
        if not q_lower or q_lower in name or q_lower in email or q_lower in phone:
            results.append({
                "id": doc.id,
                "display_name": data.get("display_name") or "",
                "email": data.get("email") or "",
                "phone": data.get("phone") or "",
                "photo_url": data.get("photo_url") or "",
            })
            if len(results) >= 20:
                break
    return {"results": results}


class AddUserBody(BaseModel):
    user_id: str


@router.post("/{device_id}/user-list", status_code=200)
async def add_user_to_device(
    device_id: str,
    body: AddUserBody,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Add a user reference to the device's user_list field."""
    db = get_firestore()
    device_ref = db.collection("devices").document(device_id)
    device_doc = device_ref.get()
    if not device_doc.exists:
        raise HTTPException(status_code=404, detail="Device not found")

    # Verify user exists
    user_doc = db.collection("users").document(body.user_id).get()
    if not user_doc.exists:
        raise HTTPException(status_code=404, detail="User not found")

    data = device_doc.to_dict() or {}
    user_list = data.get("user_list", []) or []

    # Avoid duplicates — check both string paths and DocumentReferences
    from google.cloud.firestore_v1 import DocumentReference as DocRef
    existing_ids = set()
    for entry in user_list:
        if isinstance(entry, DocRef):
            existing_ids.add(entry.id)
        elif isinstance(entry, str):
            existing_ids.add(entry.split("/")[-1])

    if body.user_id not in existing_ids:
        user_ref = db.collection("users").document(body.user_id)
        user_list.append(user_ref)
        device_ref.update({"user_list": user_list})

    return {"status": "added", "user_id": body.user_id}


@router.delete("/{device_id}/user-list/{user_id}", status_code=200)
async def remove_user_from_device(
    device_id: str,
    user_id: str,
    _user: TokenPayload = Depends(require_permission("devices", "edit")),
):
    """Remove a user reference from the device's user_list field."""
    db = get_firestore()
    device_ref = db.collection("devices").document(device_id)
    device_doc = device_ref.get()
    if not device_doc.exists:
        raise HTTPException(status_code=404, detail="Device not found")

    data = device_doc.to_dict() or {}
    user_list = data.get("user_list", []) or []

    # Remove any entry that resolves to this user_id — entries may be plain
    # "users/<id>" strings or DocumentReference objects, so handle both
    from google.cloud.firestore_v1 import DocumentReference as DocRef
    new_list = [
        entry for entry in user_list
        if not (
            (isinstance(entry, str) and entry.split("/")[-1] == user_id)
            or (isinstance(entry, DocRef) and entry.id == user_id)
        )
    ]
    device_ref.update({"user_list": new_list})

    return {"status": "removed", "user_id": user_id}
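Because `user_list` mixes plain `"users/<id>"` strings with `DocumentReference` objects, both the add and remove endpoints need the same entry-to-id resolution. A dependency-free sketch (`FakeRef` stands in for the Firestore reference type, which exposes an `.id` attribute):

```python
class FakeRef:
    """Stand-in for google.cloud.firestore_v1.DocumentReference (has an .id)."""
    def __init__(self, doc_id: str):
        self.id = doc_id

def entry_user_id(entry) -> str:
    """Resolve a user_list entry (path string or reference-like object) to its id."""
    if isinstance(entry, str):
        return entry.split("/")[-1]
    return getattr(entry, "id", "")

def remove_user(user_list: list, user_id: str) -> list:
    return [e for e in user_list if entry_user_id(e) != user_id]
```

Funnelling both shapes through one resolver avoids the add/remove paths drifting apart in how they interpret entries.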
@@ -52,11 +52,10 @@ def _generate_serial_number() -> str:
def _ensure_unique_serial(db) -> str:
    """Generate a serial number and verify it doesn't already exist in Firestore."""
    existing_sns = set()
    for doc in db.collection(COLLECTION).select(["serial_number"]).stream():
    for doc in db.collection(COLLECTION).select(["device_id"]).stream():
        data = doc.to_dict()
        sn = data.get("serial_number") or data.get("device_id")
        if sn:
            existing_sns.add(sn)
        if data.get("device_id"):
            existing_sns.add(data["device_id"])

    for _ in range(100):  # safety limit
        sn = _generate_serial_number()
@@ -72,7 +71,7 @@ def _convert_firestore_value(val):
        # Firestore DatetimeWithNanoseconds is a datetime subclass
        return val.strftime("%d %B %Y at %H:%M:%S UTC%z")
    if isinstance(val, GeoPoint):
        return {"lat": val.latitude, "lng": val.longitude}
        return f"{val.latitude}° N, {val.longitude}° E"
    if isinstance(val, DocumentReference):
        # Store the document path (e.g. "users/abc123")
        return val.path
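`_ensure_unique_serial` follows a standard generate-and-retry pattern against a preloaded set of existing serials. A self-contained sketch of that loop (the serial format is hypothetical — `_generate_serial_number()` is defined elsewhere in the diff and not shown here):

```python
import secrets

def generate_serial() -> str:
    # Hypothetical format, purely for illustration.
    return "VSP-" + secrets.token_hex(3).upper()

def ensure_unique(existing: set, attempts: int = 100) -> str:
    """Retry generation until the candidate is not in the known-existing set."""
    for _ in range(attempts):
        sn = generate_serial()
        if sn not in existing:
            return sn
    raise RuntimeError("Could not generate a unique serial number")
```

The bounded `attempts` loop mirrors the `for _ in range(100)` safety limit above: with a reasonably large serial space, collisions are rare, but the cap guarantees termination.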
@@ -96,40 +95,18 @@ def _sanitize_dict(d: dict) -> dict:
    return result


def _auto_upgrade_claimed(doc_ref, data: dict) -> dict:
    """If the device has entries in user_list and isn't already claimed/decommissioned,
    upgrade mfg_status to 'claimed' automatically and return the updated data dict."""
    current_status = data.get("mfg_status", "")
    if current_status in ("claimed", "decommissioned"):
        return data
    user_list = data.get("user_list", []) or []
    if user_list:
        doc_ref.update({"mfg_status": "claimed"})
        data = dict(data)
        data["mfg_status"] = "claimed"
    return data


def _doc_to_device(doc) -> DeviceInDB:
    """Convert a Firestore document snapshot to a DeviceInDB model.

    Also auto-upgrades mfg_status to 'claimed' if user_list is non-empty.
    """
    raw = doc.to_dict()
    raw = _auto_upgrade_claimed(doc.reference, raw)
    data = _sanitize_dict(raw)
    """Convert a Firestore document snapshot to a DeviceInDB model."""
    data = _sanitize_dict(doc.to_dict())
    return DeviceInDB(id=doc.id, **data)


FLEET_STATUSES = {"sold", "claimed"}


def list_devices(
    search: str | None = None,
    online_only: bool | None = None,
    subscription_tier: str | None = None,
) -> list[DeviceInDB]:
    """List fleet devices (sold + claimed only) with optional filters."""
    """List devices with optional filters."""
    db = get_db()
    ref = db.collection(COLLECTION)
    query = ref
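The fleet view's inclusion rule has a subtlety worth isolating: documents with *no* `mfg_status` are legacy and must stay visible, while documents with a status outside `FLEET_STATUSES` are hidden. As a standalone predicate (illustrative; the real code inlines this in the loop):

```python
FLEET_STATUSES = {"sold", "claimed"}

def in_fleet(mfg_status) -> bool:
    """Legacy docs (missing/empty mfg_status) stay visible; others must be sold/claimed."""
    return not mfg_status or mfg_status in FLEET_STATUSES
```

Writing it as `if mfg_status and mfg_status not in FLEET_STATUSES: continue` (as the diff does) is the same truth table, just phrased as an exclusion.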
@@ -141,14 +118,6 @@ def list_devices(
    results = []

    for doc in docs:
        raw = doc.to_dict() or {}

        # Only include sold/claimed devices in the fleet view.
        # Legacy devices without mfg_status are included to avoid breaking old data.
        mfg_status = raw.get("mfg_status")
        if mfg_status and mfg_status not in FLEET_STATUSES:
            continue

        device = _doc_to_device(doc)

        # Client-side filters
@@ -159,7 +128,7 @@ def list_devices(
        search_lower = search.lower()
        name_match = search_lower in (device.device_name or "").lower()
        location_match = search_lower in (device.device_location or "").lower()
        sn_match = search_lower in (device.serial_number or "").lower()
        sn_match = search_lower in (device.device_id or "").lower()
        if not (name_match or location_match or sn_match):
            continue

@@ -213,11 +182,6 @@ def update_device(device_doc_id: str, data: DeviceUpdate) -> DeviceInDB:

    update_data = data.model_dump(exclude_none=True)

    # Convert {lat, lng} dict to a Firestore GeoPoint
    coords = update_data.get("device_location_coordinates")
    if isinstance(coords, dict) and "lat" in coords and "lng" in coords:
        update_data["device_location_coordinates"] = GeoPoint(coords["lat"], coords["lng"])

    # Deep-merge nested structs so unmentioned sub-fields are preserved
    existing = doc.to_dict()
    nested_keys = (
@@ -4,7 +4,7 @@ from shared.firebase import get_db
|
||||
from shared.exceptions import NotFoundError
|
||||
from equipment.models import NoteCreate, NoteUpdate, NoteInDB
|
||||
|
||||
COLLECTION = "notes"
|
||||
COLLECTION = "equipment_notes"
|
||||
|
||||
VALID_CATEGORIES = {"general", "maintenance", "installation", "issue", "action_item", "other"}
|
||||
|
||||
|
||||
@@ -11,7 +11,7 @@ class UpdateType(str, Enum):

class FirmwareVersion(BaseModel):
    id: str
    hw_type: str  # e.g. "vesper", "vesper_plus", "vesper_pro", "bespoke"
    hw_type: str  # e.g. "vesper", "vesper_plus", "vesper_pro"
    channel: str  # "stable", "beta", "alpha", "testing"
    version: str  # semver e.g. "1.5"
    filename: str
@@ -20,10 +20,8 @@ class FirmwareVersion(BaseModel):
    update_type: UpdateType = UpdateType.mandatory
    min_fw_version: Optional[str] = None  # minimum fw version required to install this
    uploaded_at: str
    changelog: Optional[str] = None
    release_note: Optional[str] = None
    notes: Optional[str] = None
    is_latest: bool = False
    bespoke_uid: Optional[str] = None  # only set when hw_type == "bespoke"


class FirmwareListResponse(BaseModel):
@@ -32,34 +30,17 @@ class FirmwareListResponse(BaseModel):


class FirmwareMetadataResponse(BaseModel):
    """Returned by both /latest and /{version}/info endpoints.

    Two orthogonal axes:
      channel — the release track the device is subscribed to
                ("stable" | "beta" | "development")
                Firmware validates this matches the channel it requested.
      update_type — the urgency of THIS release, set by the publisher
                ("optional" | "mandatory" | "emergency")
                Firmware reads mandatory/emergency booleans derived from this.

    Additional firmware-compatible fields:
      size — binary size in bytes (firmware reads "size", not "size_bytes")
      mandatory — True when update_type is mandatory or emergency
      emergency — True only when update_type is emergency
    """
    """Returned by both /latest and /{version}/info endpoints."""
    hw_type: str
    channel: str  # release track — firmware validates this
    channel: str
    version: str
    size: int  # firmware reads "size"
    size_bytes: int  # kept for admin-panel consumers
    size_bytes: int
    sha256: str
    update_type: UpdateType  # urgency enum — for admin panel display
    mandatory: bool  # derived: update_type in (mandatory, emergency)
    emergency: bool  # derived: update_type == emergency
    update_type: UpdateType
    min_fw_version: Optional[str] = None
    download_url: str
    uploaded_at: str
    release_note: Optional[str] = None
    notes: Optional[str] = None


# Keep backwards-compatible alias

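The docstring above describes how the two device-facing booleans are derived from the publisher-set urgency enum. That mapping can be sketched on its own (the enum member names follow the "optional | mandatory | emergency" values quoted in the docstring; `derive_flags` is a hypothetical helper, not part of the codebase):

```python
from enum import Enum


class UpdateType(str, Enum):
    optional = "optional"
    mandatory = "mandatory"
    emergency = "emergency"


def derive_flags(update_type: UpdateType) -> dict:
    """Collapse the urgency enum into the two booleans the firmware reads."""
    emergency = update_type is UpdateType.emergency
    # Every emergency release is also mandatory, so 'mandatory' is a superset.
    mandatory = update_type in (UpdateType.mandatory, UpdateType.emergency)
    return {"mandatory": mandatory, "emergency": emergency}
```

Keeping the enum as the single source of truth and deriving the booleans at response time avoids the two flags ever disagreeing in storage.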
@@ -1,18 +1,13 @@
from fastapi import APIRouter, Depends, Query, UploadFile, File, Form, HTTPException
from fastapi.responses import FileResponse, PlainTextResponse
from pydantic import BaseModel
from fastapi import APIRouter, Depends, Query, UploadFile, File, Form
from fastapi.responses import FileResponse
from typing import Optional
import logging

from auth.models import TokenPayload
from auth.dependencies import require_permission
from firmware.models import FirmwareVersion, FirmwareListResponse, FirmwareMetadataResponse, UpdateType
from firmware import service

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api/firmware", tags=["firmware"])
ota_router = APIRouter(prefix="/api/ota", tags=["ota-telemetry"])


@router.post("/upload", response_model=FirmwareVersion, status_code=201)
@@ -22,9 +17,7 @@ async def upload_firmware(
    version: str = Form(...),
    update_type: UpdateType = Form(UpdateType.mandatory),
    min_fw_version: Optional[str] = Form(None),
    changelog: Optional[str] = Form(None),
    release_note: Optional[str] = Form(None),
    bespoke_uid: Optional[str] = Form(None),
    notes: Optional[str] = Form(None),
    file: UploadFile = File(...),
    _user: TokenPayload = Depends(require_permission("manufacturing", "add")),
):
@@ -36,9 +29,7 @@ async def upload_firmware(
        file_bytes=file_bytes,
        update_type=update_type,
        min_fw_version=min_fw_version,
        changelog=changelog,
        release_note=release_note,
        bespoke_uid=bespoke_uid,
        notes=notes,
    )


@@ -53,28 +44,11 @@ def list_firmware(


@router.get("/{hw_type}/{channel}/latest", response_model=FirmwareMetadataResponse)
def get_latest_firmware(
    hw_type: str,
    channel: str,
    hw_version: Optional[str] = Query(None, description="Hardware revision from NVS, e.g. '1.0'"),
    current_version: Optional[str] = Query(None, description="Currently running firmware semver, e.g. '1.2.3'"),
):
def get_latest_firmware(hw_type: str, channel: str):
    """Returns metadata for the latest firmware for a given hw_type + channel.
    No auth required — devices call this endpoint to check for updates.
    """
    return service.get_latest(hw_type, channel, hw_version=hw_version, current_version=current_version)


@router.get("/{hw_type}/{channel}/latest/changelog", response_class=PlainTextResponse)
def get_latest_changelog(hw_type: str, channel: str):
    """Returns the full changelog for the latest firmware. Plain text."""
    return service.get_latest_changelog(hw_type, channel)


@router.get("/{hw_type}/{channel}/{version}/info/changelog", response_class=PlainTextResponse)
def get_version_changelog(hw_type: str, channel: str, version: str):
    """Returns the full changelog for a specific firmware version. Plain text."""
    return service.get_version_changelog(hw_type, channel, version)
    return service.get_latest(hw_type, channel)


@router.get("/{hw_type}/{channel}/{version}/info", response_model=FirmwareMetadataResponse)
@@ -96,85 +70,9 @@ def download_firmware(hw_type: str, channel: str, version: str):
    )


@router.put("/{firmware_id}", response_model=FirmwareVersion)
async def edit_firmware(
    firmware_id: str,
    channel: Optional[str] = Form(None),
    version: Optional[str] = Form(None),
    update_type: Optional[UpdateType] = Form(None),
    min_fw_version: Optional[str] = Form(None),
    changelog: Optional[str] = Form(None),
    release_note: Optional[str] = Form(None),
    bespoke_uid: Optional[str] = Form(None),
    file: Optional[UploadFile] = File(None),
    _user: TokenPayload = Depends(require_permission("manufacturing", "add")),
):
    file_bytes = await file.read() if file and file.filename else None
    return service.edit_firmware(
        doc_id=firmware_id,
        channel=channel,
        version=version,
        update_type=update_type,
        min_fw_version=min_fw_version,
        changelog=changelog,
        release_note=release_note,
        bespoke_uid=bespoke_uid,
        file_bytes=file_bytes,
    )


@router.delete("/{firmware_id}", status_code=204)
def delete_firmware(
    firmware_id: str,
    _user: TokenPayload = Depends(require_permission("manufacturing", "delete")),
):
    service.delete_firmware(firmware_id)


# ─────────────────────────────────────────────────────────────────────────────
# OTA event telemetry — called by devices (no auth, best-effort)
# ─────────────────────────────────────────────────────────────────────────────

class OtaDownloadEvent(BaseModel):
    device_uid: str
    hw_type: str
    hw_version: str
    from_version: str
    to_version: str
    channel: str


class OtaFlashEvent(BaseModel):
    device_uid: str
    hw_type: str
    hw_version: str
    from_version: str
    to_version: str
    channel: str
    sha256: str


@ota_router.post("/events/download", status_code=204)
def ota_event_download(event: OtaDownloadEvent):
    """Device reports that firmware was fully written to flash (pre-commit).
    No auth required — best-effort telemetry from the device.
    """
    logger.info(
        "OTA download event: device=%s hw=%s/%s %s → %s (channel=%s)",
        event.device_uid, event.hw_type, event.hw_version,
        event.from_version, event.to_version, event.channel,
    )
    service.record_ota_event("download", event.model_dump())


@ota_router.post("/events/flash", status_code=204)
def ota_event_flash(event: OtaFlashEvent):
    """Device reports that firmware partition was committed and device is rebooting.
    No auth required — best-effort telemetry from the device.
    """
    logger.info(
        "OTA flash event: device=%s hw=%s/%s %s → %s (channel=%s sha256=%.16s...)",
        event.device_uid, event.hw_type, event.hw_version,
        event.from_version, event.to_version, event.channel, event.sha256,
    )
    service.record_ota_event("flash", event.model_dump())

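The two `@ota_router.post` endpoints above define a fire-and-forget telemetry protocol for devices. A hedged sketch of the calling side, using only the stdlib (the helper name and base URL are hypothetical; the endpoint paths come from `ota_router`'s `/api/ota` prefix and the `/events/download` / `/events/flash` routes, which return 204 and take no auth):

```python
import json
import urllib.request


def post_ota_event(base_url: str, kind: str, payload: dict, timeout: float = 3.0) -> bool:
    """Best-effort POST to /api/ota/events/{download|flash}; never raises.

    Returns True only when the server answered 204, False on any network
    or HTTP failure, matching the 'best-effort telemetry' contract above.
    """
    req = urllib.request.Request(
        f"{base_url}/api/ota/events/{kind}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 204
    except OSError:  # URLError/HTTPError/timeouts are all OSError subclasses
        return False
```

A device would call it with the fields of `OtaDownloadEvent`, e.g. `post_ota_event(base, "download", {"device_uid": ..., "hw_type": ..., "hw_version": ..., "from_version": ..., "to_version": ..., "channel": ...})`, and ignore the result.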
@@ -1,9 +1,7 @@
import hashlib
import logging
import uuid
from datetime import datetime, timezone
from pathlib import Path
from typing import Any

from fastapi import HTTPException

@@ -12,11 +10,9 @@ from shared.firebase import get_db
from shared.exceptions import NotFoundError
from firmware.models import FirmwareVersion, FirmwareMetadataResponse, UpdateType

logger = logging.getLogger(__name__)

COLLECTION = "firmware_versions"

VALID_HW_TYPES = {"vesper", "vesper_plus", "vesper_pro", "chronos", "chronos_pro", "agnus", "agnus_mini", "bespoke"}
VALID_HW_TYPES = {"vesper", "vesper_plus", "vesper_pro", "chronos", "chronos_pro", "agnus", "agnus_mini"}
VALID_CHANNELS = {"stable", "beta", "alpha", "testing"}


@@ -43,31 +39,24 @@ def _doc_to_firmware_version(doc) -> FirmwareVersion:
        update_type=data.get("update_type", UpdateType.mandatory),
        min_fw_version=data.get("min_fw_version"),
        uploaded_at=uploaded_str,
        changelog=data.get("changelog"),
        release_note=data.get("release_note"),
        notes=data.get("notes"),
        is_latest=data.get("is_latest", False),
        bespoke_uid=data.get("bespoke_uid"),
    )


def _fw_to_metadata_response(fw: FirmwareVersion) -> FirmwareMetadataResponse:
    download_url = f"/api/firmware/{fw.hw_type}/{fw.channel}/{fw.version}/firmware.bin"
    is_emergency = fw.update_type == UpdateType.emergency
    is_mandatory = fw.update_type in (UpdateType.mandatory, UpdateType.emergency)
    return FirmwareMetadataResponse(
        hw_type=fw.hw_type,
        channel=fw.channel,  # firmware validates this matches requested channel
        channel=fw.channel,
        version=fw.version,
        size=fw.size_bytes,  # firmware reads "size"
        size_bytes=fw.size_bytes,  # kept for admin-panel consumers
        size_bytes=fw.size_bytes,
        sha256=fw.sha256,
        update_type=fw.update_type,  # urgency enum — for admin panel display
        mandatory=is_mandatory,  # firmware reads this to decide auto-apply
        emergency=is_emergency,  # firmware reads this to decide immediate apply
        update_type=fw.update_type,
        min_fw_version=fw.min_fw_version,
        download_url=download_url,
        uploaded_at=fw.uploaded_at,
        release_note=fw.release_note,
        notes=fw.notes,
    )


@@ -78,50 +67,24 @@ def upload_firmware(
    file_bytes: bytes,
    update_type: UpdateType = UpdateType.mandatory,
    min_fw_version: str | None = None,
    changelog: str | None = None,
    release_note: str | None = None,
    bespoke_uid: str | None = None,
    notes: str | None = None,
) -> FirmwareVersion:
    if hw_type not in VALID_HW_TYPES:
        raise HTTPException(status_code=400, detail=f"Invalid hw_type. Must be one of: {', '.join(sorted(VALID_HW_TYPES))}")
    if channel not in VALID_CHANNELS:
        raise HTTPException(status_code=400, detail=f"Invalid channel. Must be one of: {', '.join(sorted(VALID_CHANNELS))}")
    if hw_type == "bespoke" and not bespoke_uid:
        raise HTTPException(status_code=400, detail="bespoke_uid is required when hw_type is 'bespoke'")

    db = get_db()
    sha256 = hashlib.sha256(file_bytes).hexdigest()
    now = datetime.now(timezone.utc)

    # For bespoke firmware: if a firmware with the same bespoke_uid already exists,
    # overwrite it (delete old doc + file, reuse same storage path keyed by uid).
    if hw_type == "bespoke" and bespoke_uid:
        existing_docs = list(
            db.collection(COLLECTION)
            .where("hw_type", "==", "bespoke")
            .where("bespoke_uid", "==", bespoke_uid)
            .stream()
        )
        for old_doc in existing_docs:
            old_data = old_doc.to_dict() or {}
            old_path = _storage_path("bespoke", old_data.get("channel", channel), old_data.get("version", version))
            if old_path.exists():
                old_path.unlink()
                try:
                    old_path.parent.rmdir()
                except OSError:
                    pass
            old_doc.reference.delete()

    dest = _storage_path(hw_type, channel, version)
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_bytes(file_bytes)

    sha256 = hashlib.sha256(file_bytes).hexdigest()
    now = datetime.now(timezone.utc)
    doc_id = str(uuid.uuid4())

    db = get_db()

    # Mark previous latest for this hw_type+channel as no longer latest
    # (skip for bespoke — each bespoke_uid is its own independent firmware)
    if hw_type != "bespoke":
        prev_docs = (
            db.collection(COLLECTION)
            .where("hw_type", "==", hw_type)
@@ -143,10 +106,8 @@ def upload_firmware(
        "update_type": update_type.value,
        "min_fw_version": min_fw_version,
        "uploaded_at": now,
        "changelog": changelog,
        "release_note": release_note,
        "notes": notes,
        "is_latest": True,
        "bespoke_uid": bespoke_uid,
    })

    return _doc_to_firmware_version(doc_ref.get())
@@ -169,11 +130,9 @@ def list_firmware(
    return items


def get_latest(hw_type: str, channel: str, hw_version: str | None = None, current_version: str | None = None) -> FirmwareMetadataResponse:
def get_latest(hw_type: str, channel: str) -> FirmwareMetadataResponse:
    if hw_type not in VALID_HW_TYPES:
        raise HTTPException(status_code=400, detail=f"Invalid hw_type '{hw_type}'")
    if hw_type == "bespoke":
        raise HTTPException(status_code=400, detail="Bespoke firmware is not served via auto-update. Use the direct download URL.")
    if channel not in VALID_CHANNELS:
        raise HTTPException(status_code=400, detail=f"Invalid channel '{channel}'")

@@ -214,52 +173,6 @@ def get_version_info(hw_type: str, channel: str, version: str) -> FirmwareMetada
    return _fw_to_metadata_response(_doc_to_firmware_version(docs[0]))


def get_latest_changelog(hw_type: str, channel: str) -> str:
    if hw_type not in VALID_HW_TYPES:
        raise HTTPException(status_code=400, detail=f"Invalid hw_type '{hw_type}'")
    if channel not in VALID_CHANNELS:
        raise HTTPException(status_code=400, detail=f"Invalid channel '{channel}'")

    db = get_db()
    docs = list(
        db.collection(COLLECTION)
        .where("hw_type", "==", hw_type)
        .where("channel", "==", channel)
        .where("is_latest", "==", True)
        .limit(1)
        .stream()
    )
    if not docs:
        raise NotFoundError("Firmware")
    fw = _doc_to_firmware_version(docs[0])
    if not fw.changelog:
        raise NotFoundError("Changelog")
    return fw.changelog


def get_version_changelog(hw_type: str, channel: str, version: str) -> str:
    if hw_type not in VALID_HW_TYPES:
        raise HTTPException(status_code=400, detail=f"Invalid hw_type '{hw_type}'")
    if channel not in VALID_CHANNELS:
        raise HTTPException(status_code=400, detail=f"Invalid channel '{channel}'")

    db = get_db()
    docs = list(
        db.collection(COLLECTION)
        .where("hw_type", "==", hw_type)
        .where("channel", "==", channel)
        .where("version", "==", version)
        .limit(1)
        .stream()
    )
    if not docs:
        raise NotFoundError("Firmware version")
    fw = _doc_to_firmware_version(docs[0])
    if not fw.changelog:
        raise NotFoundError("Changelog")
    return fw.changelog


def get_firmware_path(hw_type: str, channel: str, version: str) -> Path:
    path = _storage_path(hw_type, channel, version)
    if not path.exists():
@@ -267,98 +180,6 @@ def get_firmware_path(hw_type: str, channel: str, version: str) -> Path:
    return path


def record_ota_event(event_type: str, payload: dict[str, Any]) -> None:
    """Persist an OTA telemetry event (download or flash) to Firestore.

    Best-effort — caller should not raise on failure.
    """
    try:
        db = get_db()
        db.collection("ota_events").add({
            "event_type": event_type,
            "received_at": datetime.now(timezone.utc),
            **payload,
        })
    except Exception as exc:
        logger.warning("Failed to persist OTA event (%s): %s", event_type, exc)


def edit_firmware(
    doc_id: str,
    channel: str | None = None,
    version: str | None = None,
    update_type: UpdateType | None = None,
    min_fw_version: str | None = None,
    changelog: str | None = None,
    release_note: str | None = None,
    bespoke_uid: str | None = None,
    file_bytes: bytes | None = None,
) -> FirmwareVersion:
    db = get_db()
    doc_ref = db.collection(COLLECTION).document(doc_id)
    doc = doc_ref.get()
    if not doc.exists:
        raise NotFoundError("Firmware")

    data = doc.to_dict() or {}
    hw_type = data["hw_type"]
    old_channel = data.get("channel", "")
    old_version = data.get("version", "")

    effective_channel = channel if channel is not None else old_channel
    effective_version = version if version is not None else old_version

    if channel is not None and channel not in VALID_CHANNELS:
        raise HTTPException(status_code=400, detail=f"Invalid channel. Must be one of: {', '.join(sorted(VALID_CHANNELS))}")

    updates: dict = {}
    if channel is not None:
        updates["channel"] = channel
    if version is not None:
        updates["version"] = version
    if update_type is not None:
        updates["update_type"] = update_type.value
    if min_fw_version is not None:
        updates["min_fw_version"] = min_fw_version if min_fw_version else None
    if changelog is not None:
        updates["changelog"] = changelog if changelog else None
    if release_note is not None:
        updates["release_note"] = release_note if release_note else None
    if bespoke_uid is not None:
        updates["bespoke_uid"] = bespoke_uid if bespoke_uid else None

    if file_bytes is not None:
        # Move binary if path changed
        old_path = _storage_path(hw_type, old_channel, old_version)
        new_path = _storage_path(hw_type, effective_channel, effective_version)
        if old_path != new_path and old_path.exists():
            old_path.unlink()
            try:
                old_path.parent.rmdir()
            except OSError:
                pass
        new_path.parent.mkdir(parents=True, exist_ok=True)
        new_path.write_bytes(file_bytes)
        updates["sha256"] = hashlib.sha256(file_bytes).hexdigest()
        updates["size_bytes"] = len(file_bytes)
    elif (channel is not None and channel != old_channel) or (version is not None and version != old_version):
        # Path changed but no new file — move existing binary
        old_path = _storage_path(hw_type, old_channel, old_version)
        new_path = _storage_path(hw_type, effective_channel, effective_version)
        if old_path.exists() and old_path != new_path:
            new_path.parent.mkdir(parents=True, exist_ok=True)
            old_path.rename(new_path)
            try:
                old_path.parent.rmdir()
            except OSError:
                pass

    if updates:
        doc_ref.update(updates)

    return _doc_to_firmware_version(doc_ref.get())


def delete_firmware(doc_id: str) -> None:
    db = get_db()
    doc_ref = db.collection(COLLECTION).document(doc_id)
@@ -390,9 +211,9 @@ def delete_firmware(doc_id: str) -> None:
        db.collection(COLLECTION)
        .where("hw_type", "==", hw_type)
        .where("channel", "==", channel)
        .order_by("uploaded_at", direction="DESCENDING")
        .limit(1)
        .stream()
    )
    if remaining:
        # Sort in Python to avoid needing a composite Firestore index
        remaining.sort(key=lambda d: d.to_dict().get("uploaded_at", ""), reverse=True)
        remaining[0].reference.update({"is_latest": True})

@@ -15,20 +15,19 @@ from staff.router import router as staff_router
from helpdesk.router import router as helpdesk_router
from builder.router import router as builder_router
from manufacturing.router import router as manufacturing_router
from firmware.router import router as firmware_router, ota_router
from firmware.router import router as firmware_router
from admin.router import router as admin_router
from crm.router import router as crm_products_router
from crm.customers_router import router as crm_customers_router
from crm.orders_router import router as crm_orders_router, global_router as crm_orders_global_router
from crm.orders_router import router as crm_orders_router
from crm.comms_router import router as crm_comms_router
from crm.media_router import router as crm_media_router
from crm.nextcloud_router import router as crm_nextcloud_router
from crm.quotations_router import router as crm_quotations_router
from public.router import router as public_router
from crm.nextcloud import close_client as close_nextcloud_client, keepalive_ping as nextcloud_keepalive
from crm.mail_accounts import get_mail_accounts
from mqtt.client import mqtt_manager
import database as db
from mqtt import database as mqtt_db
from melodies import service as melody_service

app = FastAPI(
@@ -59,17 +58,14 @@ app.include_router(staff_router)
app.include_router(builder_router)
app.include_router(manufacturing_router)
app.include_router(firmware_router)
app.include_router(ota_router)
app.include_router(admin_router)
app.include_router(crm_products_router)
app.include_router(crm_customers_router)
app.include_router(crm_orders_router)
app.include_router(crm_orders_global_router)
app.include_router(crm_comms_router)
app.include_router(crm_media_router)
app.include_router(crm_nextcloud_router)
app.include_router(crm_quotations_router)
app.include_router(public_router)


async def nextcloud_keepalive_loop():
@@ -89,25 +85,14 @@ async def email_sync_loop():
        print(f"[EMAIL SYNC] Error: {e}")


async def crm_poll_loop():
    while True:
        await asyncio.sleep(24 * 60 * 60)  # once per day
        try:
            from crm.service import poll_crm_customer_statuses
            poll_crm_customer_statuses()
        except Exception as e:
            print(f"[CRM POLL] Error: {e}")


@app.on_event("startup")
async def startup():
    init_firebase()
    await db.init_db()
    await mqtt_db.init_db()
    await melody_service.migrate_from_firestore()
    mqtt_manager.start(asyncio.get_event_loop())
    asyncio.create_task(db.purge_loop())
    asyncio.create_task(mqtt_db.purge_loop())
    asyncio.create_task(nextcloud_keepalive_loop())
    asyncio.create_task(crm_poll_loop())
    sync_accounts = [a for a in get_mail_accounts() if a.get("sync_inbound") and a.get("imap_host")]
    if sync_accounts:
        print(f"[EMAIL SYNC] IMAP configured for {len(sync_accounts)} account(s) - starting sync loop")
@@ -119,7 +104,7 @@ async def startup():
@app.on_event("shutdown")
async def shutdown():
    mqtt_manager.stop()
    await db.close_db()
    await mqtt_db.close_db()
    await close_nextcloud_client()

@@ -1,6 +1,6 @@
import json
import logging
from database import get_db
from mqtt.database import get_db

logger = logging.getLogger("manufacturing.audit")


@@ -15,7 +15,7 @@ class BoardType(str, Enum):

BOARD_TYPE_LABELS = {
    "vesper": "Vesper",
    "vesper_plus": "Vesper Plus",
    "vesper_plus": "Vesper+",
    "vesper_pro": "Vesper Pro",
    "chronos": "Chronos",
    "chronos_pro": "Chronos Pro",
@@ -23,28 +23,6 @@ BOARD_TYPE_LABELS = {
    "agnus": "Agnus",
}

# Family codes (BS + 4 chars = segment 1 of serial number)
BOARD_FAMILY_CODES = {
    "vesper": "VSPR",
    "vesper_plus": "VSPR",
    "vesper_pro": "VSPR",
    "agnus": "AGNS",
    "agnus_mini": "AGNS",
    "chronos": "CRNS",
    "chronos_pro": "CRNS",
}

# Variant codes (3 chars = first part of segment 3 of serial number)
BOARD_VARIANT_CODES = {
    "vesper": "STD",
    "vesper_plus": "PLS",
    "vesper_pro": "PRO",
    "agnus": "STD",
    "agnus_mini": "MIN",
    "chronos": "STD",
    "chronos_pro": "PRO",
}

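The comments on the two code tables above pin down only part of the serial-number layout: segment 1 is the literal "BS" prefix plus the 4-char family code, and the 3-char variant code opens segment 3. A sketch limited to exactly those documented pieces (segment 2 and the rest of segment 3 are not specified here, so they are deliberately omitted; the helper names are hypothetical):

```python
BOARD_FAMILY_CODES = {
    "vesper": "VSPR", "vesper_plus": "VSPR", "vesper_pro": "VSPR",
    "agnus": "AGNS", "agnus_mini": "AGNS",
    "chronos": "CRNS", "chronos_pro": "CRNS",
}

BOARD_VARIANT_CODES = {
    "vesper": "STD", "vesper_plus": "PLS", "vesper_pro": "PRO",
    "agnus": "STD", "agnus_mini": "MIN",
    "chronos": "STD", "chronos_pro": "PRO",
}


def serial_segment_1(board_type: str) -> str:
    """Segment 1: literal 'BS' prefix + 4-char family code."""
    return "BS" + BOARD_FAMILY_CODES[board_type]


def serial_segment_3_prefix(board_type: str) -> str:
    """First part of segment 3: the 3-char variant code."""
    return BOARD_VARIANT_CODES[board_type]
```

Note that family codes collapse related boards (all Vesper variants share "VSPR") while the variant code is what distinguishes them within segment 3.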
class MfgStatus(str, Enum):
    manufactured = "manufactured"
@@ -55,13 +33,6 @@ class MfgStatus(str, Enum):
    decommissioned = "decommissioned"


class LifecycleEntry(BaseModel):
    status_id: str
    date: str  # ISO 8601 UTC string
    note: Optional[str] = None
    set_by: Optional[str] = None


class BatchCreate(BaseModel):
    board_type: BoardType
    board_version: str = Field(
@@ -91,9 +62,6 @@ class DeviceInventoryItem(BaseModel):
    owner: Optional[str] = None
    assigned_to: Optional[str] = None
    device_name: Optional[str] = None
    lifecycle_history: Optional[List["LifecycleEntry"]] = None
    customer_id: Optional[str] = None
    user_list: Optional[List[str]] = None


class DeviceInventoryListResponse(BaseModel):
@@ -104,19 +72,11 @@ class DeviceInventoryListResponse(BaseModel):
class DeviceStatusUpdate(BaseModel):
    status: MfgStatus
    note: Optional[str] = None
    force_claimed: bool = False


class DeviceAssign(BaseModel):
    customer_id: str


class CustomerSearchResult(BaseModel):
    id: str
    name: str = ""
    email: str = ""
    organization: str = ""
    phone: str = ""
    customer_email: str
    customer_name: Optional[str] = None


class RecentActivityItem(BaseModel):

@@ -1,8 +1,7 @@
|
||||
from fastapi import APIRouter, Depends, Query, HTTPException, UploadFile, File
|
||||
from fastapi import APIRouter, Depends, Query, HTTPException
|
||||
from fastapi.responses import Response
|
||||
from fastapi.responses import RedirectResponse
|
||||
from typing import Optional
|
||||
from pydantic import BaseModel
|
||||
|
||||
from auth.models import TokenPayload
|
||||
from auth.dependencies import require_permission
|
||||
@@ -15,23 +14,6 @@ from manufacturing.models import (
|
||||
from manufacturing import service
|
||||
from manufacturing import audit
|
||||
from shared.exceptions import NotFoundError
|
||||
from shared.firebase import get_db as get_firestore
|
||||
|
||||
|
||||
class LifecycleEntryPatch(BaseModel):
|
||||
index: int
|
||||
date: Optional[str] = None
|
||||
note: Optional[str] = None
|
||||
|
||||
class LifecycleEntryCreate(BaseModel):
|
||||
status_id: str
|
||||
date: Optional[str] = None
|
||||
note: Optional[str] = None
|
||||
|
||||
VALID_FLASH_ASSETS = {"bootloader.bin", "partitions.bin"}
|
||||
VALID_HW_TYPES_MFG = {"vesper", "vesper_plus", "vesper_pro", "agnus", "agnus_mini", "chronos", "chronos_pro"}
|
||||
# Bespoke UIDs are dynamic — we allow any non-empty slug that doesn't clash with
|
||||
# a standard hw_type name. The flash-asset upload endpoint checks this below.
|
||||
|
||||
router = APIRouter(prefix="/api/manufacturing", tags=["manufacturing"])
|
||||
|
||||
@@ -98,75 +80,13 @@ def get_device(
    return service.get_device_by_sn(sn)


@router.get("/customers/search")
def search_customers(
    q: str = Query(""),
    _user: TokenPayload = Depends(require_permission("manufacturing", "view")),
):
    """Search CRM customers by name, email, phone, organization, or tags."""
    results = service.search_customers(q)
    return {"results": results}


@router.get("/customers/{customer_id}")
def get_customer(
    customer_id: str,
    _user: TokenPayload = Depends(require_permission("manufacturing", "view")),
):
    """Get a single CRM customer by ID."""
    db = get_firestore()
    doc = db.collection("crm_customers").document(customer_id).get()
    if not doc.exists:
        raise HTTPException(status_code=404, detail="Customer not found")
    data = doc.to_dict() or {}
    loc = data.get("location") or {}
    city = loc.get("city") if isinstance(loc, dict) else None
    return {
        "id": doc.id,
        "name": data.get("name") or "",
        "surname": data.get("surname") or "",
        "email": data.get("email") or "",
        "organization": data.get("organization") or "",
        "phone": data.get("phone") or "",
        "city": city or "",
    }


@router.patch("/devices/{sn}/status", response_model=DeviceInventoryItem)
async def update_status(
    sn: str,
    body: DeviceStatusUpdate,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    # Guard: claimed requires at least one user in user_list
    # (allow if explicitly force_claimed=true, which the mfg UI sets after adding a user manually)
    if body.status.value == "claimed":
        db = get_firestore()
        docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
        if docs:
            data = docs[0].to_dict() or {}
            user_list = data.get("user_list", []) or []
            if not user_list and not getattr(body, "force_claimed", False):
                raise HTTPException(
                    status_code=400,
                    detail="Cannot set status to 'claimed': device has no users in user_list. "
                    "Assign a user first, then set to Claimed.",
                )

    # Guard: sold requires a customer assigned
    if body.status.value == "sold":
        db = get_firestore()
        docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
        if docs:
            data = docs[0].to_dict() or {}
            if not data.get("customer_id"):
                raise HTTPException(
                    status_code=400,
                    detail="Cannot set status to 'sold' without an assigned customer. "
                    "Use the 'Assign to Customer' action first.",
                )

    result = service.update_device_status(sn, body, set_by=user.email)
    result = service.update_device_status(sn, body)
    await audit.log_action(
        admin_user=user.email,
        action="status_updated",
@@ -176,108 +96,12 @@ async def update_status(
    return result


@router.patch("/devices/{sn}/lifecycle", response_model=DeviceInventoryItem)
async def patch_lifecycle_entry(
    sn: str,
    body: LifecycleEntryPatch,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    """Edit the date and/or note of a lifecycle history entry by index."""
    db = get_firestore()
    docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise HTTPException(status_code=404, detail="Device not found")
    doc_ref = docs[0].reference
    data = docs[0].to_dict() or {}
    history = data.get("lifecycle_history") or []
    if body.index < 0 or body.index >= len(history):
        raise HTTPException(status_code=400, detail="Invalid lifecycle entry index")
    if body.date is not None:
        history[body.index]["date"] = body.date
    if body.note is not None:
        history[body.index]["note"] = body.note
    doc_ref.update({"lifecycle_history": history})
    from manufacturing.service import _doc_to_inventory_item
    return _doc_to_inventory_item(doc_ref.get())


@router.post("/devices/{sn}/lifecycle", response_model=DeviceInventoryItem, status_code=200)
async def create_lifecycle_entry(
    sn: str,
    body: LifecycleEntryCreate,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    """Upsert a lifecycle history entry for the given status_id.

    If an entry for this status already exists it is overwritten in-place;
    otherwise a new entry is appended. This prevents duplicate entries when
    a status is visited more than once (max one entry per status).
    """
    from datetime import datetime, timezone
    db = get_firestore()
    docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise HTTPException(status_code=404, detail="Device not found")
    doc_ref = docs[0].reference
    data = docs[0].to_dict() or {}
    history = list(data.get("lifecycle_history") or [])

    new_entry = {
        "status_id": body.status_id,
        "date": body.date or datetime.now(timezone.utc).isoformat(),
        "note": body.note,
        "set_by": user.email,
    }

    # Overwrite existing entry for this status if present, else append
    existing_idx = next(
        (i for i, e in enumerate(history) if e.get("status_id") == body.status_id),
        None,
    )
    if existing_idx is not None:
        history[existing_idx] = new_entry
    else:
        history.append(new_entry)

    doc_ref.update({"lifecycle_history": history})
    from manufacturing.service import _doc_to_inventory_item
    return _doc_to_inventory_item(doc_ref.get())


@router.delete("/devices/{sn}/lifecycle/{index}", response_model=DeviceInventoryItem)
async def delete_lifecycle_entry(
    sn: str,
    index: int,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    """Delete a lifecycle history entry by index. Cannot delete the entry for the current status."""
    db = get_firestore()
    docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise HTTPException(status_code=404, detail="Device not found")
    doc_ref = docs[0].reference
    data = docs[0].to_dict() or {}
    history = data.get("lifecycle_history") or []
    if index < 0 or index >= len(history):
        raise HTTPException(status_code=400, detail="Invalid lifecycle entry index")
    current_status = data.get("mfg_status", "")
    if history[index].get("status_id") == current_status:
        raise HTTPException(status_code=400, detail="Cannot delete the entry for the current status. Change the status first.")
    history.pop(index)
    doc_ref.update({"lifecycle_history": history})
    from manufacturing.service import _doc_to_inventory_item
    return _doc_to_inventory_item(doc_ref.get())


@router.get("/devices/{sn}/nvs.bin")
async def download_nvs(
    sn: str,
    hw_type_override: Optional[str] = Query(None, description="Override hw_type written to NVS (for bespoke firmware)"),
    hw_revision_override: Optional[str] = Query(None, description="Override hw_revision written to NVS (for bespoke firmware)"),
    nvs_schema: Optional[str] = Query(None, description="NVS schema to use: 'legacy' or 'new' (default)"),
    user: TokenPayload = Depends(require_permission("manufacturing", "view")),
):
    binary = service.get_nvs_binary(sn, hw_type_override=hw_type_override, hw_revision_override=hw_revision_override, legacy=(nvs_schema == "legacy"))
    binary = service.get_nvs_binary(sn)
    await audit.log_action(
        admin_user=user.email,
        action="device_flashed",
@@ -296,15 +120,12 @@ async def assign_device(
    body: DeviceAssign,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    try:
        result = service.assign_device(sn, body)
    except NotFoundError as e:
        raise HTTPException(status_code=404, detail=str(e))
    await audit.log_action(
        admin_user=user.email,
        action="device_assigned",
        serial_number=sn,
        detail={"customer_id": body.customer_id},
        detail={"customer_email": body.customer_email, "customer_name": body.customer_name},
    )
    return result
@@ -330,91 +151,6 @@ async def delete_device(
    )


@router.post("/devices/{sn}/email/manufactured", status_code=204)
async def send_manufactured_email(
    sn: str,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    """Send the 'device manufactured' notification to the assigned customer's email."""
    db = get_firestore()
    docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise HTTPException(status_code=404, detail="Device not found")
    data = docs[0].to_dict() or {}
    customer_id = data.get("customer_id")
    if not customer_id:
        raise HTTPException(status_code=400, detail="No customer assigned to this device")
    customer_doc = db.collection("crm_customers").document(customer_id).get()
    if not customer_doc.exists:
        raise HTTPException(status_code=404, detail="Assigned customer not found")
    cdata = customer_doc.to_dict() or {}
    email = cdata.get("email")
    if not email:
        raise HTTPException(status_code=400, detail="Customer has no email address")
    name_parts = [cdata.get("name") or "", cdata.get("surname") or ""]
    customer_name = " ".join(p for p in name_parts if p).strip() or None
    hw_family = data.get("hw_family") or data.get("hw_type") or ""
    from utils.emails.device_mfged_mail import send_device_manufactured_email
    send_device_manufactured_email(
        customer_email=email,
        serial_number=sn,
        device_name=hw_family.replace("_", " ").title(),
        customer_name=customer_name,
    )
    await audit.log_action(
        admin_user=user.email,
        action="email_manufactured_sent",
        serial_number=sn,
        detail={"recipient": email},
    )


@router.post("/devices/{sn}/email/assigned", status_code=204)
async def send_assigned_email(
    sn: str,
    user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    """Send the 'device assigned / app instructions' email to the assigned user(s)."""
    db = get_firestore()
    docs = list(db.collection("devices").where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise HTTPException(status_code=404, detail="Device not found")
    data = docs[0].to_dict() or {}
    user_list = data.get("user_list") or []
    if not user_list:
        raise HTTPException(status_code=400, detail="No users assigned to this device")
    hw_family = data.get("hw_family") or data.get("hw_type") or ""
    device_name = hw_family.replace("_", " ").title()
    from utils.emails.device_assigned_mail import send_device_assigned_email
    errors = []
    for uid in user_list:
        try:
            user_doc = db.collection("users").document(uid).get()
            if not user_doc.exists:
                continue
            udata = user_doc.to_dict() or {}
            email = udata.get("email")
            if not email:
                continue
            display_name = udata.get("display_name") or udata.get("name") or None
            send_device_assigned_email(
                user_email=email,
                serial_number=sn,
                device_name=device_name,
                user_name=display_name,
            )
        except Exception as exc:
            errors.append(str(exc))
    if errors:
        raise HTTPException(status_code=500, detail=f"Some emails failed: {'; '.join(errors)}")
    await audit.log_action(
        admin_user=user.email,
        action="email_assigned_sent",
        serial_number=sn,
        detail={"user_count": len(user_list)},
    )


@router.delete("/devices", status_code=200)
async def delete_unprovisioned(
    user: TokenPayload = Depends(require_permission("manufacturing", "delete")),
@@ -439,123 +175,3 @@ def redirect_firmware(
    """
    url = service.get_firmware_url(sn)
    return RedirectResponse(url=url, status_code=302)


# ─────────────────────────────────────────────────────────────────────────────
# Flash assets — bootloader.bin and partitions.bin per hw_type
# These are the binaries that must be flashed at fixed addresses during full
# provisioning (0x1000 bootloader, 0x8000 partition table).
# They are NOT flashed during OTA updates — only during initial provisioning.
# Upload once per hw_type after each PlatformIO build that changes the layout.
# ─────────────────────────────────────────────────────────────────────────────

@router.get("/flash-assets")
def list_flash_assets(
    _user: TokenPayload = Depends(require_permission("manufacturing", "view")),
):
    """Return asset status for all known board types (and any discovered bespoke UIDs).

    Checks the filesystem directly — no database involved.
    Each entry contains: hw_type, bootloader (exists, size, uploaded_at), partitions (same), note.
    """
    return {"assets": service.list_flash_assets()}


@router.delete("/flash-assets/{hw_type}/{asset}", status_code=204)
async def delete_flash_asset(
    hw_type: str,
    asset: str,
    user: TokenPayload = Depends(require_permission("manufacturing", "delete")),
):
    """Delete a single flash asset file (bootloader.bin or partitions.bin)."""
    if asset not in VALID_FLASH_ASSETS:
        raise HTTPException(status_code=400, detail=f"Invalid asset. Must be one of: {', '.join(sorted(VALID_FLASH_ASSETS))}")
    try:
        service.delete_flash_asset(hw_type, asset)
    except NotFoundError as e:
        raise HTTPException(status_code=404, detail=str(e))
    await audit.log_action(
        admin_user=user.email,
        action="flash_asset_deleted",
        detail={"hw_type": hw_type, "asset": asset},
    )


class FlashAssetNoteBody(BaseModel):
    note: str


@router.put("/flash-assets/{hw_type}/note", status_code=204)
async def set_flash_asset_note(
    hw_type: str,
    body: FlashAssetNoteBody,
    _user: TokenPayload = Depends(require_permission("manufacturing", "edit")),
):
    """Save (or overwrite) the note for a hw_type's flash asset set.

    The note is stored as note.txt next to the binary files.
    Pass an empty string to clear the note.
    """
    service.set_flash_asset_note(hw_type, body.note)


@router.post("/flash-assets/{hw_type}/{asset}", status_code=204)
async def upload_flash_asset(
    hw_type: str,
    asset: str,
    file: UploadFile = File(...),
    _user: TokenPayload = Depends(require_permission("manufacturing", "add")),
):
    """Upload a bootloader.bin or partitions.bin for a given hw_type.

    These are build artifacts from PlatformIO (.pio/build/{env}/bootloader.bin
    and .pio/build/{env}/partitions.bin). Upload them once per hw_type after
    each PlatformIO build that changes the partition layout.
    """
    # hw_type can be a standard board type OR a bespoke UID (any non-empty slug)
    if not hw_type or len(hw_type) > 128:
        raise HTTPException(status_code=400, detail="Invalid hw_type/bespoke UID.")
    if asset not in VALID_FLASH_ASSETS:
        raise HTTPException(status_code=400, detail=f"Invalid asset. Must be one of: {', '.join(sorted(VALID_FLASH_ASSETS))}")
    data = await file.read()
    service.save_flash_asset(hw_type, asset, data)


@router.get("/devices/{sn}/bootloader.bin")
def download_bootloader(
    sn: str,
    hw_type_override: Optional[str] = Query(None, description="Override hw_type for flash asset lookup (for bespoke firmware)"),
    _user: TokenPayload = Depends(require_permission("manufacturing", "view")),
):
    """Return the bootloader.bin for this device's hw_type (flashed at 0x1000)."""
    item = service.get_device_by_sn(sn)
    hw_type = hw_type_override or item.hw_type
    try:
        data = service.get_flash_asset(hw_type, "bootloader.bin")
    except NotFoundError as e:
        raise HTTPException(status_code=404, detail=str(e))
    return Response(
        content=data,
        media_type="application/octet-stream",
        headers={"Content-Disposition": f'attachment; filename="bootloader_{hw_type}.bin"'},
    )


@router.get("/devices/{sn}/partitions.bin")
def download_partitions(
    sn: str,
    hw_type_override: Optional[str] = Query(None, description="Override hw_type for flash asset lookup (for bespoke firmware)"),
    _user: TokenPayload = Depends(require_permission("manufacturing", "view")),
):
    """Return the partitions.bin for this device's hw_type (flashed at 0x8000)."""
    item = service.get_device_by_sn(sn)
    hw_type = hw_type_override or item.hw_type
    try:
        data = service.get_flash_asset(hw_type, "partitions.bin")
    except NotFoundError as e:
        raise HTTPException(status_code=404, detail=str(e))
    return Response(
        content=data,
        media_type="application/octet-stream",
        headers={"Content-Disposition": f'attachment; filename="partitions_{hw_type}.bin"'},
    )
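
> The bootloader/partitions endpoints above serve the binaries that get flashed at the fixed 0x1000 / 0x8000 offsets during initial provisioning, alongside the per-device `nvs.bin`. As a minimal sketch (not part of the diff), a provisioning helper might assemble the esptool invocation like this — the NVS offset and working-directory layout are assumptions, since they depend on the partition table actually in use:

```python
from pathlib import Path

# Fixed flash offsets from the flash-assets comment block above.
BOOTLOADER_OFFSET = 0x1000
PARTITIONS_OFFSET = 0x8000


def build_esptool_args(workdir: str, nvs_offset: int = 0x9000) -> list[str]:
    """Assemble an esptool write_flash argument list for initial provisioning.

    Assumes bootloader.bin, partitions.bin and nvs.bin were already downloaded
    from the /api/manufacturing endpoints into `workdir`. The 0x9000 NVS offset
    is a hypothetical default — check the actual partition table.
    """
    d = Path(workdir)
    return [
        "esptool.py", "write_flash",
        hex(BOOTLOADER_OFFSET), str(d / "bootloader.bin"),
        hex(PARTITIONS_OFFSET), str(d / "partitions.bin"),
        hex(nvs_offset), str(d / "nvs.bin"),
    ]
```

The returned list can be passed to `subprocess.run` on the provisioning bench; OTA updates never touch these offsets, as the comment block notes.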
@@ -2,11 +2,9 @@ import logging
import random
import string
from datetime import datetime, timezone
from pathlib import Path

logger = logging.getLogger(__name__)

from config import settings
from shared.firebase import get_db
from shared.exceptions import NotFoundError
from utils.serial_number import generate_serial
@@ -33,18 +31,6 @@ def _get_existing_sns(db) -> set:
    return existing


def _resolve_user_list(raw_list: list) -> list[str]:
    """Convert user_list entries (DocumentReferences or path strings) to plain user ID strings."""
    from google.cloud.firestore_v1 import DocumentReference
    result = []
    for entry in raw_list:
        if isinstance(entry, DocumentReference):
            result.append(entry.id)
        elif isinstance(entry, str):
            result.append(entry.split("/")[-1])
    return result


def _doc_to_inventory_item(doc) -> DeviceInventoryItem:
    data = doc.to_dict() or {}
    created_raw = data.get("created_at")
@@ -64,9 +50,6 @@ def _doc_to_inventory_item(doc) -> DeviceInventoryItem:
        owner=data.get("owner"),
        assigned_to=data.get("assigned_to"),
        device_name=data.get("device_name") or None,
        lifecycle_history=data.get("lifecycle_history") or [],
        customer_id=data.get("customer_id"),
        user_list=_resolve_user_list(data.get("user_list") or []),
    )


@@ -95,19 +78,11 @@ def create_batch(data: BatchCreate) -> BatchResponse:
            "created_at": now,
            "owner": None,
            "assigned_to": None,
            "user_list": [],
            "users_list": [],
            # Legacy fields left empty so existing device views don't break
            "device_name": "",
            "device_location": "",
            "is_Online": False,
            "lifecycle_history": [
                {
                    "status_id": "manufactured",
                    "date": now.isoformat(),
                    "note": None,
                    "set_by": None,
                }
            ],
        })
        serial_numbers.append(sn)

@@ -158,38 +133,14 @@ def get_device_by_sn(sn: str) -> DeviceInventoryItem:
    return _doc_to_inventory_item(docs[0])


def update_device_status(sn: str, data: DeviceStatusUpdate, set_by: str | None = None) -> DeviceInventoryItem:
def update_device_status(sn: str, data: DeviceStatusUpdate) -> DeviceInventoryItem:
    db = get_db()
    docs = list(db.collection(COLLECTION).where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise NotFoundError("Device")

    doc_ref = docs[0].reference
    doc_data = docs[0].to_dict() or {}
    now = datetime.now(timezone.utc).isoformat()

    history = list(doc_data.get("lifecycle_history") or [])

    # Upsert lifecycle entry — overwrite existing entry for this status if present
    new_entry = {
        "status_id": data.status.value,
        "date": now,
        "note": data.note if data.note else None,
        "set_by": set_by,
    }
    existing_idx = next(
        (i for i, e in enumerate(history) if e.get("status_id") == data.status.value),
        None,
    )
    if existing_idx is not None:
        history[existing_idx] = new_entry
    else:
        history.append(new_entry)

    update = {
        "mfg_status": data.status.value,
        "lifecycle_history": history,
    }
    update = {"mfg_status": data.status.value}
    if data.note:
        update["mfg_status_note"] = data.note
    doc_ref.update(update)
@@ -197,115 +148,47 @@ def update_device_status(sn: str, data: DeviceStatusUpdate, set_by: str | None =
    return _doc_to_inventory_item(doc_ref.get())


def get_nvs_binary(sn: str, hw_type_override: str | None = None, hw_revision_override: str | None = None, legacy: bool = False) -> bytes:
def get_nvs_binary(sn: str) -> bytes:
    item = get_device_by_sn(sn)
    return generate_nvs_binary(
        serial_number=item.serial_number,
        hw_family=hw_type_override if hw_type_override else item.hw_type,
        hw_revision=hw_revision_override if hw_revision_override else item.hw_version,
        legacy=legacy,
        hw_type=item.hw_type,
        hw_version=item.hw_version,
    )


def assign_device(sn: str, data: DeviceAssign) -> DeviceInventoryItem:
    """Assign a device to a customer by customer_id.
    from utils.email import send_device_assignment_invite

    - Stores customer_id on the device doc.
    - Adds the device to the customer's owned_items list.
    - Sets mfg_status to 'sold' unless device is already 'claimed'.
    """
    db = get_db()
    CRM_COLLECTION = "crm_customers"

    # Get device doc
    docs = list(db.collection(COLLECTION).where("serial_number", "==", sn).limit(1).stream())
    if not docs:
        raise NotFoundError("Device")

    doc_data = docs[0].to_dict() or {}
    doc_ref = docs[0].reference
    current_status = doc_data.get("mfg_status", "manufactured")

    # Get customer doc
    customer_ref = db.collection(CRM_COLLECTION).document(data.customer_id)
    customer_doc = customer_ref.get()
    if not customer_doc.exists:
        raise NotFoundError("Customer")
    customer_data = customer_doc.to_dict() or {}

    # Determine new status: don't downgrade claimed → sold
    new_status = current_status if current_status == "claimed" else "sold"

    now = datetime.now(timezone.utc).isoformat()
    history = doc_data.get("lifecycle_history") or []
    history.append({
        "status_id": new_status,
        "date": now,
        "note": "Assigned to customer",
        "set_by": None,
    })

    doc_ref.update({
        "customer_id": data.customer_id,
        "mfg_status": new_status,
        "lifecycle_history": history,
        "owner": data.customer_email,
        "assigned_to": data.customer_email,
        "mfg_status": "sold",
    })

    # Add to customer's owned_items (avoid duplicates)
    owned_items = customer_data.get("owned_items", []) or []
    device_doc_id = docs[0].id
    already_assigned = any(
        item.get("type") == "console_device"
        and item.get("console_device", {}).get("device_id") == device_doc_id
        for item in owned_items
    hw_type = doc_data.get("hw_type", "")
    device_name = BOARD_TYPE_LABELS.get(hw_type, hw_type or "Device")

    try:
        send_device_assignment_invite(
            customer_email=data.customer_email,
            serial_number=sn,
            device_name=device_name,
            customer_name=data.customer_name,
        )
    if not already_assigned:
        device_name = doc_data.get("device_name") or BOARD_TYPE_LABELS.get(doc_data.get("hw_type", ""), sn)
        owned_items.append({
            "type": "console_device",
            "console_device": {
                "device_id": device_doc_id,
                "serial_number": sn,
                "label": device_name,
            },
        })
        customer_ref.update({"owned_items": owned_items})
    except Exception as exc:
        logger.error("Assignment succeeded but email failed for %s → %s: %s", sn, data.customer_email, exc)

    return _doc_to_inventory_item(doc_ref.get())


def search_customers(q: str) -> list:
    """Search crm_customers by name, email, phone, organization, or tags."""
    db = get_db()
    CRM_COLLECTION = "crm_customers"
    docs = db.collection(CRM_COLLECTION).stream()
    results = []
    q_lower = q.lower().strip()
    for doc in docs:
        data = doc.to_dict() or {}
        loc = data.get("location") or {}
        loc = loc if isinstance(loc, dict) else {}
        city = loc.get("city") or ""
        searchable = " ".join(filter(None, [
            data.get("name"), data.get("surname"),
            data.get("email"), data.get("phone"), data.get("organization"),
            loc.get("address"), loc.get("city"), loc.get("postal_code"),
            loc.get("region"), loc.get("country"),
            " ".join(data.get("tags") or []),
        ])).lower()
        if not q_lower or q_lower in searchable:
            results.append({
                "id": doc.id,
                "name": data.get("name") or "",
                "surname": data.get("surname") or "",
                "email": data.get("email") or "",
                "organization": data.get("organization") or "",
                "phone": data.get("phone") or "",
                "city": city or "",
            })
    return results


def get_stats() -> ManufacturingStats:
    db = get_db()
    docs = list(db.collection(COLLECTION).stream())
@@ -387,105 +270,6 @@ def delete_unprovisioned_devices() -> list[str]:
    return deleted


KNOWN_HW_TYPES = ["vesper", "vesper_plus", "vesper_pro", "agnus", "agnus_mini", "chronos", "chronos_pro"]
FLASH_ASSET_FILES = ["bootloader.bin", "partitions.bin"]


def _flash_asset_path(hw_type: str, asset: str) -> Path:
    """Return path to a flash asset (bootloader.bin or partitions.bin) for a given hw_type."""
    return Path(settings.flash_assets_storage_path) / hw_type / asset


def _flash_asset_info(hw_type: str) -> dict:
    """Build the asset info dict for a single hw_type by inspecting the filesystem."""
    base = Path(settings.flash_assets_storage_path) / hw_type
    note_path = base / "note.txt"
    note = note_path.read_text(encoding="utf-8").strip() if note_path.exists() else ""

    files = {}
    for fname in FLASH_ASSET_FILES:
        p = base / fname
        if p.exists():
            stat = p.stat()
            files[fname] = {
                "exists": True,
                "size_bytes": stat.st_size,
                "uploaded_at": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
            }
        else:
            files[fname] = {"exists": False, "size_bytes": None, "uploaded_at": None}

    return {
        "hw_type": hw_type,
        "bootloader": files["bootloader.bin"],
        "partitions": files["partitions.bin"],
        "note": note,
    }


def list_flash_assets() -> list:
    """Return asset status for all known board types plus any discovered bespoke directories."""
    base = Path(settings.flash_assets_storage_path)
    results = []

    # Always include all known hw types, even if no files uploaded yet
    seen = set(KNOWN_HW_TYPES)
    for hw_type in KNOWN_HW_TYPES:
        results.append(_flash_asset_info(hw_type))

    # Discover bespoke directories (anything in storage/flash_assets/ not in known list)
    if base.exists():
        for entry in sorted(base.iterdir()):
            if entry.is_dir() and entry.name not in seen:
                seen.add(entry.name)
                info = _flash_asset_info(entry.name)
                info["is_bespoke"] = True
                results.append(info)

    # Mark known types
    for r in results:
        r.setdefault("is_bespoke", False)

    return results


def save_flash_asset(hw_type: str, asset: str, data: bytes) -> Path:
    """Persist a flash asset binary. asset must be 'bootloader.bin' or 'partitions.bin'."""
    if asset not in ("bootloader.bin", "partitions.bin"):
        raise ValueError(f"Unknown flash asset: {asset}")
    path = _flash_asset_path(hw_type, asset)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(data)
    return path


def delete_flash_asset(hw_type: str, asset: str) -> None:
    """Delete a flash asset file. Raises NotFoundError if not present."""
    path = _flash_asset_path(hw_type, asset)
    if not path.exists():
        raise NotFoundError(f"Flash asset '{asset}' for '{hw_type}' not found")
    path.unlink()


def set_flash_asset_note(hw_type: str, note: str) -> None:
    """Write (or clear) the note for a hw_type's flash asset directory."""
    base = Path(settings.flash_assets_storage_path) / hw_type
    base.mkdir(parents=True, exist_ok=True)
    note_path = base / "note.txt"
    if note.strip():
        note_path.write_text(note.strip(), encoding="utf-8")
    elif note_path.exists():
        note_path.unlink()


def get_flash_asset(hw_type: str, asset: str) -> bytes:
    """Load a flash asset binary. Raises NotFoundError if not uploaded yet."""
    path = _flash_asset_path(hw_type, asset)
    if not path.exists():
        raise NotFoundError(f"Flash asset '{asset}' for hw_type '{hw_type}' — upload it first via POST /api/manufacturing/flash-assets/{{hw_type}}/{{asset}}")
    return path.read_bytes()


def get_firmware_url(sn: str) -> str:
    """Return the FastAPI download URL for the latest stable firmware for this device's hw_type."""
    from firmware.service import get_latest
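
> Both `create_lifecycle_entry` (router) and the new `update_device_status` (service) apply the same "at most one history entry per status" rule. A self-contained sketch of that shared upsert pattern, extracted from the diff above into a pure function for clarity (the function name is illustrative, not from the codebase):

```python
def upsert_lifecycle_entry(history: list[dict], entry: dict) -> list[dict]:
    """Return a copy of history with `entry` upserted by its status_id.

    Mirrors the logic in create_lifecycle_entry / update_device_status:
    overwrite the existing entry for this status if present, else append,
    so a status visited twice never produces duplicate entries.
    """
    idx = next(
        (i for i, e in enumerate(history) if e.get("status_id") == entry["status_id"]),
        None,
    )
    out = list(history)
    if idx is not None:
        out[idx] = entry
    else:
        out.append(entry)
    return out
```

Repeated status transitions therefore update the timestamp and note in place rather than growing the list.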
@@ -1,6 +1,6 @@
|
||||
import json
|
||||
import logging
|
||||
from database import get_db
|
||||
from mqtt.database import get_db
|
||||
|
||||
logger = logging.getLogger("melodies.database")
|
||||
|
||||
|
||||
@@ -30,7 +30,6 @@ class MelodyInfo(BaseModel):
|
||||
isTrueRing: bool = False
|
||||
previewURL: str = ""
|
||||
archetype_csv: Optional[str] = None
|
||||
outdated_archetype: bool = False
|
||||
|
||||
|
||||
class MelodyAttributes(BaseModel):
|
||||
|
||||
@@ -146,23 +146,6 @@ async def get_files(
|
||||
return service.get_storage_files(melody_id, melody.uid)
|
||||
|
||||
|
||||
@router.patch("/{melody_id}/set-outdated", response_model=MelodyInDB)
|
||||
async def set_outdated(
|
||||
melody_id: str,
|
||||
outdated: bool = Query(...),
|
||||
_user: TokenPayload = Depends(require_permission("melodies", "edit")),
|
||||
):
|
||||
"""Manually set or clear the outdated_archetype flag on a melody."""
|
||||
melody = await service.get_melody(melody_id)
|
||||
info = melody.information.model_dump()
|
||||
info["outdated_archetype"] = outdated
|
||||
return await service.update_melody(
|
||||
melody_id,
|
||||
MelodyUpdate(information=MelodyInfo(**info)),
|
||||
actor_name=_user.name,
|
||||
)
|
||||
|
||||
|
||||
@router.get("/{melody_id}/download/binary")
|
||||
async def download_binary_file(
|
||||
melody_id: str,
|
||||
|
||||
@@ -1,178 +0,0 @@
"""
One-time migration script: convert legacy negotiating/has_problem flags to new structure.

Run AFTER deploying the new backend code:
    cd backend && python migrate_customer_flags.py

What it does:
1. For each customer with negotiating=True:
   - Creates an order subcollection document with status="negotiating"
   - Sets relationship_status="active" (only if currently "lead" or "prospect")
2. For each customer with has_problem=True:
   - Appends one entry to technical_issues with active=True
3. Removes negotiating and has_problem fields from every customer document
4. Initialises relationship_status="lead" on any customer missing it
5. Recomputes crm_summary for each affected customer
"""

import sys
import os
import uuid
from datetime import datetime

# Make sure we can import backend modules
sys.path.insert(0, os.path.dirname(__file__))

from shared.firebase import init_firebase, get_db

init_firebase()


def migrate():
    db = get_db()
    customers_ref = db.collection("crm_customers")
    docs = list(customers_ref.stream())
    print(f"Found {len(docs)} customer documents.")

    migrated_neg = 0
    migrated_prob = 0
    now = datetime.utcnow().isoformat()

    for doc in docs:
        data = doc.to_dict() or {}
        customer_id = doc.id
        updates = {}
        changed = False

        # ── 1. Initialise new fields if missing ──────────────────────────────
        if "relationship_status" not in data:
            updates["relationship_status"] = "lead"
            changed = True
        if "technical_issues" not in data:
            updates["technical_issues"] = []
            changed = True
        if "install_support" not in data:
            updates["install_support"] = []
            changed = True
        if "transaction_history" not in data:
            updates["transaction_history"] = []
            changed = True

        # ── 2. Migrate negotiating flag ───────────────────────────────────────
        if data.get("negotiating"):
            order_id = str(uuid.uuid4())
            order_data = {
                "customer_id": customer_id,
                "order_number": f"ORD-{datetime.utcnow().year}-001-migrated",
                "title": "Migrated from legacy negotiating flag",
                "created_by": "system",
                "status": "negotiating",
                "status_updated_date": now,
                "status_updated_by": "system",
                "items": [],
                "subtotal": 0,
                "discount": None,
                "total_price": 0,
                "currency": "EUR",
                "shipping": None,
                "payment_status": {
                    "required_amount": 0,
                    "received_amount": 0,
                    "balance_due": 0,
                    "advance_required": False,
                    "advance_amount": None,
                    "payment_complete": False,
                },
                "invoice_path": None,
                "notes": "Migrated from legacy negotiating flag",
                "timeline": [{
                    "date": now,
                    "type": "note",
                    "note": "Migrated from legacy negotiating flag",
                    "updated_by": "system",
                }],
                "created_at": now,
                "updated_at": now,
            }
            customers_ref.document(customer_id).collection("orders").document(order_id).set(order_data)

            current_rel = updates.get("relationship_status") or data.get("relationship_status", "lead")
            if current_rel in ("lead", "prospect"):
                updates["relationship_status"] = "active"

            migrated_neg += 1
            print(f" [{customer_id}] Created negotiating order, set relationship_status=active")

        # ── 3. Migrate has_problem flag ───────────────────────────────────────
        if data.get("has_problem"):
            existing_issues = list(updates.get("technical_issues") or data.get("technical_issues") or [])
            existing_issues.append({
                "active": True,
                "opened_date": data.get("updated_at") or now,
                "resolved_date": None,
                "note": "Migrated from legacy has_problem flag",
                "opened_by": "system",
                "resolved_by": None,
            })
            updates["technical_issues"] = existing_issues
            migrated_prob += 1
            changed = True
            print(f" [{customer_id}] Appended technical issue from has_problem flag")

        # ── 4. Remove legacy fields ───────────────────────────────────────────
        from google.cloud.firestore_v1 import DELETE_FIELD
        if "negotiating" in data:
            updates["negotiating"] = DELETE_FIELD
            changed = True
        if "has_problem" in data:
            updates["has_problem"] = DELETE_FIELD
            changed = True

        if changed or data.get("negotiating") or data.get("has_problem"):
            updates["updated_at"] = now
            customers_ref.document(customer_id).update(updates)

        # ── 5. Recompute crm_summary ──────────────────────────────────────────
        # Re-read updated doc to compute summary
        updated_doc = customers_ref.document(customer_id).get()
        updated_data = updated_doc.to_dict() or {}

        issues = updated_data.get("technical_issues") or []
        active_issues = [i for i in issues if i.get("active")]
        support = updated_data.get("install_support") or []
        active_support = [s for s in support if s.get("active")]

        TERMINAL = {"declined", "complete"}
        active_order_status = None
        active_order_status_date = None
        active_order_title = None
        latest_date = ""
        for odoc in customers_ref.document(customer_id).collection("orders").stream():
            odata = odoc.to_dict() or {}
            if odata.get("status") not in TERMINAL:
                upd = odata.get("status_updated_date") or odata.get("created_at") or ""
                if upd > latest_date:
                    latest_date = upd
                    active_order_status = odata.get("status")
                    active_order_status_date = upd
                    active_order_title = odata.get("title")

        summary = {
            "active_order_status": active_order_status,
            "active_order_status_date": active_order_status_date,
            "active_order_title": active_order_title,
            "active_issues_count": len(active_issues),
            "latest_issue_date": max((i.get("opened_date") or "") for i in active_issues) if active_issues else None,
            "active_support_count": len(active_support),
            "latest_support_date": max((s.get("opened_date") or "") for s in active_support) if active_support else None,
        }
        customers_ref.document(customer_id).update({"crm_summary": summary})

    print("\nMigration complete.")
    print(f" Negotiating orders created: {migrated_neg}")
    print(f" Technical issues created: {migrated_prob}")
    print(f" Total customers processed: {len(docs)}")


if __name__ == "__main__":
    migrate()
@@ -2,11 +2,10 @@ import aiosqlite
import asyncio
import json
import logging
import os
from datetime import datetime, timedelta, timezone
from config import settings

-logger = logging.getLogger("database")
+logger = logging.getLogger("mqtt.database")

_db: aiosqlite.Connection | None = None

@@ -163,8 +162,6 @@ SCHEMA_STATEMENTS = [
        quotation_id TEXT NOT NULL,
        product_id TEXT,
        description TEXT,
        description_en TEXT,
        description_gr TEXT,
        unit_type TEXT NOT NULL DEFAULT 'pcs',
        unit_cost REAL NOT NULL DEFAULT 0,
        discount_percent REAL NOT NULL DEFAULT 0,

@@ -180,7 +177,6 @@ SCHEMA_STATEMENTS = [

async def init_db():
    global _db
    os.makedirs(os.path.dirname(os.path.abspath(settings.sqlite_db_path)), exist_ok=True)
    _db = await aiosqlite.connect(settings.sqlite_db_path)
    _db.row_factory = aiosqlite.Row
    for stmt in SCHEMA_STATEMENTS:

@@ -201,13 +197,6 @@ async def init_db():
        "ALTER TABLE crm_quotations ADD COLUMN client_location TEXT",
        "ALTER TABLE crm_quotations ADD COLUMN client_phone TEXT",
        "ALTER TABLE crm_quotations ADD COLUMN client_email TEXT",
        "ALTER TABLE crm_quotations ADD COLUMN is_legacy INTEGER NOT NULL DEFAULT 0",
        "ALTER TABLE crm_quotations ADD COLUMN legacy_date TEXT",
        "ALTER TABLE crm_quotations ADD COLUMN legacy_pdf_path TEXT",
        "ALTER TABLE crm_media ADD COLUMN thumbnail_path TEXT",
        "ALTER TABLE crm_quotation_items ADD COLUMN description_en TEXT",
        "ALTER TABLE crm_quotation_items ADD COLUMN description_gr TEXT",
        "ALTER TABLE built_melodies ADD COLUMN is_builtin INTEGER NOT NULL DEFAULT 0",
    ]
    for m in _migrations:
        try:
@@ -1,5 +1,5 @@
import logging
-import database as db
+from mqtt import database as db

logger = logging.getLogger("mqtt.logger")


@@ -8,7 +8,7 @@ from mqtt.models import (
    CommandListResponse, HeartbeatEntry,
)
from mqtt.client import mqtt_manager
-import database as db
+from mqtt import database as db
from datetime import datetime, timezone

router = APIRouter(prefix="/api/mqtt", tags=["mqtt"])
BIN backend/mqtt_data.db (new file)
@@ -1,214 +0,0 @@
"""
Public (no-auth) endpoints for CloudFlash and feature gate checks.
"""
from fastapi import APIRouter, HTTPException
from fastapi.responses import Response
from pydantic import BaseModel
from typing import List, Optional

from settings.public_features_service import get_public_features
from firmware.service import list_firmware
from utils.nvs_generator import generate as generate_nvs
from manufacturing.service import get_device_by_sn
from shared.exceptions import NotFoundError

router = APIRouter(prefix="/api/public", tags=["public"])


# ── Feature gate ──────────────────────────────────────────────────────────────

class CloudFlashStatus(BaseModel):
    enabled: bool


@router.get("/cloudflash/status", response_model=CloudFlashStatus)
async def cloudflash_status():
    """Returns whether the CloudFlash public page is currently enabled."""
    settings = get_public_features()
    return CloudFlashStatus(enabled=settings.cloudflash_enabled)


def _require_cloudflash_enabled():
    """Raises 403 if CloudFlash is disabled."""
    settings = get_public_features()
    if not settings.cloudflash_enabled:
        raise HTTPException(status_code=403, detail="CloudFlash is currently disabled.")


# ── Public firmware list ───────────────────────────────────────────────────────

class PublicFirmwareOption(BaseModel):
    hw_type: str
    hw_type_label: str
    channel: str
    version: str
    download_url: str


HW_TYPE_LABELS = {
    "vesper": "Vesper",
    "vesper_plus": "Vesper Plus",
    "vesper_pro": "Vesper Pro",
    "agnus": "Agnus",
    "agnus_mini": "Agnus Mini",
    "chronos": "Chronos",
    "chronos_pro": "Chronos Pro",
}


@router.get("/cloudflash/firmware", response_model=List[PublicFirmwareOption])
async def list_public_firmware():
    """
    Returns all available firmware options (is_latest=True, non-bespoke, stable channel only).
    No authentication required — used by the public CloudFlash page.
    """
    _require_cloudflash_enabled()

    all_fw = list_firmware()
    options = []
    for fw in all_fw:
        if not fw.is_latest:
            continue
        if fw.hw_type == "bespoke":
            continue
        if fw.channel != "stable":
            continue
        options.append(PublicFirmwareOption(
            hw_type=fw.hw_type,
            hw_type_label=HW_TYPE_LABELS.get(fw.hw_type, fw.hw_type.replace("_", " ").title()),
            channel=fw.channel,
            version=fw.version,
            download_url=f"/api/firmware/{fw.hw_type}/{fw.channel}/{fw.version}/firmware.bin",
        ))

    # Sort by hw_type label
    options.sort(key=lambda x: x.hw_type_label)
    return options


# ── Public serial number validation ──────────────────────────────────────────

class SerialValidationResult(BaseModel):
    valid: bool
    hw_type: Optional[str] = None
    hw_type_label: Optional[str] = None
    hw_version: Optional[str] = None


@router.get("/cloudflash/validate-serial/{serial_number}", response_model=SerialValidationResult)
async def validate_serial(serial_number: str):
    """
    Check whether a serial number exists in the device database.
    Returns hw_type info if found so the frontend can confirm it matches the user's selection.
    No sensitive device data is returned.
    """
    _require_cloudflash_enabled()

    sn = serial_number.strip().upper()
    try:
        device = get_device_by_sn(sn)
        return SerialValidationResult(
            valid=True,
            hw_type=device.hw_type,
            hw_type_label=HW_TYPE_LABELS.get(device.hw_type, device.hw_type.replace("_", " ").title()),
            hw_version=device.hw_version,
        )
    except Exception:
        return SerialValidationResult(valid=False)


# ── Public NVS generation ─────────────────────────────────────────────────────

class NvsRequest(BaseModel):
    serial_number: str
    hw_type: str
    hw_revision: str
    nvs_schema: str = "new"  # "legacy" | "new"

    @property
    def legacy(self) -> bool:
        return self.nvs_schema == "legacy"


@router.post("/cloudflash/nvs.bin")
async def generate_public_nvs(body: NvsRequest):
    """
    Generate an NVS binary for a given serial number + hardware info.
    No authentication required — used by the public CloudFlash page for Full Wipe flash.
    The serial number is provided by the user (they read it from the sticker on their device).
    """
    _require_cloudflash_enabled()

    sn = body.serial_number.strip().upper()
    if not sn:
        raise HTTPException(status_code=422, detail="Serial number is required.")

    hw_type = body.hw_type.strip().lower()
    hw_revision = body.hw_revision.strip()

    if not hw_type or not hw_revision:
        raise HTTPException(status_code=422, detail="hw_type and hw_revision are required.")

    try:
        nvs_bytes = generate_nvs(
            serial_number=sn,
            hw_family=hw_type,
            hw_revision=hw_revision,
            legacy=body.legacy,
        )
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"NVS generation failed: {str(e)}")

    return Response(
        content=nvs_bytes,
        media_type="application/octet-stream",
        headers={"Content-Disposition": f'attachment; filename="{sn}_nvs.bin"'},
    )


# ── Public flash assets (bootloader + partitions) ─────────────────────────────

@router.get("/cloudflash/{hw_type}/bootloader.bin")
async def get_public_bootloader(hw_type: str):
    """
    Serve the bootloader binary for a given hw_type.
    No authentication required — used by the public CloudFlash page.
    """
    _require_cloudflash_enabled()

    import os
    from config import settings as cfg
    from pathlib import Path

    asset_path = Path(cfg.flash_assets_storage_path) / hw_type / "bootloader.bin"
    if not asset_path.exists():
        raise HTTPException(status_code=404, detail=f"Bootloader not found for {hw_type}.")

    return Response(
        content=asset_path.read_bytes(),
        media_type="application/octet-stream",
        headers={"Content-Disposition": f'attachment; filename="bootloader_{hw_type}.bin"'},
    )


@router.get("/cloudflash/{hw_type}/partitions.bin")
async def get_public_partitions(hw_type: str):
    """
    Serve the partition table binary for a given hw_type.
    No authentication required — used by the public CloudFlash page.
    """
    _require_cloudflash_enabled()

    import os
    from config import settings as cfg
    from pathlib import Path

    asset_path = Path(cfg.flash_assets_storage_path) / hw_type / "partitions.bin"
    if not asset_path.exists():
        raise HTTPException(status_code=404, detail=f"Partition table not found for {hw_type}.")

    return Response(
        content=asset_path.read_bytes(),
        media_type="application/octet-stream",
        headers={"Content-Disposition": f'attachment; filename="partitions_{hw_type}.bin"'},
    )
@@ -13,5 +13,3 @@ resend==2.10.0
httpx>=0.27.0
weasyprint>=62.0
jinja2>=3.1.0
Pillow>=10.0.0
pdf2image>=1.17.0
@@ -1,10 +0,0 @@
from pydantic import BaseModel
from typing import Optional


class PublicFeaturesSettings(BaseModel):
    cloudflash_enabled: bool = False


class PublicFeaturesSettingsUpdate(BaseModel):
    cloudflash_enabled: Optional[bool] = None
@@ -1,31 +0,0 @@
from shared.firebase import get_db
from settings.public_features_models import PublicFeaturesSettings, PublicFeaturesSettingsUpdate

COLLECTION = "admin_settings"
DOC_ID = "public_features"


def get_public_features() -> PublicFeaturesSettings:
    """Get public features settings from Firestore. Creates defaults if not found."""
    db = get_db()
    doc = db.collection(COLLECTION).document(DOC_ID).get()
    if doc.exists:
        return PublicFeaturesSettings(**doc.to_dict())
    defaults = PublicFeaturesSettings()
    db.collection(COLLECTION).document(DOC_ID).set(defaults.model_dump())
    return defaults


def update_public_features(data: PublicFeaturesSettingsUpdate) -> PublicFeaturesSettings:
    """Update public features settings. Only provided fields are updated."""
    db = get_db()
    doc_ref = db.collection(COLLECTION).document(DOC_ID)
    doc = doc_ref.get()

    existing = doc.to_dict() if doc.exists else PublicFeaturesSettings().model_dump()
    update_data = data.model_dump(exclude_none=True)
    existing.update(update_data)

    normalized = PublicFeaturesSettings(**existing)
    doc_ref.set(normalized.model_dump())
    return normalized
@@ -1,11 +1,8 @@
from fastapi import APIRouter, Depends
from auth.models import TokenPayload
-from auth.dependencies import require_permission, require_roles
-from auth.models import Role
+from auth.dependencies import require_permission
from settings.models import MelodySettings, MelodySettingsUpdate
-from settings.public_features_models import PublicFeaturesSettings, PublicFeaturesSettingsUpdate
from settings import service
-from settings import public_features_service

router = APIRouter(prefix="/api/settings", tags=["settings"])

@@ -23,20 +20,3 @@ async def update_melody_settings(
    _user: TokenPayload = Depends(require_permission("melodies", "edit")),
):
    return service.update_melody_settings(body)
-
-
-# ── Public Features Settings (sysadmin / admin only) ─────────────────────────
-
-@router.get("/public-features", response_model=PublicFeaturesSettings)
-async def get_public_features(
-    _user: TokenPayload = Depends(require_roles(Role.sysadmin, Role.admin)),
-):
-    return public_features_service.get_public_features()
-
-
-@router.put("/public-features", response_model=PublicFeaturesSettings)
-async def update_public_features(
-    body: PublicFeaturesSettingsUpdate,
-    _user: TokenPayload = Depends(require_roles(Role.sysadmin, Role.admin)),
-):
-    return public_features_service.update_public_features(body)
@@ -464,7 +464,7 @@

<div class="client-block">
    <div class="block-title">{{ L_CLIENT }}</div>
-    <table class="fields"><tbody>{% if customer.organization %}<tr><td class="lbl">{{ L_ORG }}</td><td class="val">{{ customer.organization }}</td></tr>{% endif %}{% set name_parts = [customer.title, customer.name, customer.surname] | select | list %}{% if name_parts %}<tr><td class="lbl">{{ L_CONTACT }}</td><td class="val">{{ name_parts | join(' ') }}</td></tr>{% endif %}{% if quotation.client_location %}<tr><td class="lbl">{{ L_ADDRESS }}</td><td class="val">{{ quotation.client_location }}</td></tr>{% elif customer.location %}{% set loc_parts = [customer.location.address, customer.location.city, customer.location.postal_code, customer.location.region, customer.location.country] | select | list %}{% if loc_parts %}<tr><td class="lbl">{{ L_ADDRESS }}</td><td class="val">{{ loc_parts | join(', ') }}</td></tr>{% endif %}{% endif %}{% if customer_email %}<tr><td class="lbl">Email</td><td class="val">{{ customer_email }}</td></tr>{% endif %}{% if customer_phone %}<tr><td class="lbl">{{ L_PHONE }}</td><td class="val">{{ customer_phone }}</td></tr>{% endif %}</tbody></table>
+    <table class="fields"><tbody>{% if customer.organization %}<tr><td class="lbl">{{ L_ORG }}</td><td class="val">{{ customer.organization }}</td></tr>{% endif %}{% set name_parts = [customer.title, customer.name, customer.surname] | select | list %}{% if name_parts %}<tr><td class="lbl">{{ L_CONTACT }}</td><td class="val">{{ name_parts | join(' ') }}</td></tr>{% endif %}{% if customer.location %}{% set loc_parts = [customer.location.city, customer.location.region, customer.location.country] | select | list %}{% if loc_parts %}<tr><td class="lbl">{{ L_ADDRESS }}</td><td class="val">{{ loc_parts | join(', ') }}</td></tr>{% endif %}{% endif %}{% if customer_email %}<tr><td class="lbl">Email</td><td class="val">{{ customer_email }}</td></tr>{% endif %}{% if customer_phone %}<tr><td class="lbl">{{ L_PHONE }}</td><td class="val">{{ customer_phone }}</td></tr>{% endif %}</tbody></table>
</div>

<div class="order-block">

@@ -490,7 +490,7 @@
<tbody>
    {% for item in quotation.items %}
    <tr>
-        <td>{% if lang == 'gr' %}{{ item.description_gr or item.description or '' }}{% else %}{{ item.description_en or item.description or '' }}{% endif %}</td>
+        <td>{{ item.description or '' }}</td>
        <td class="right">{{ item.unit_cost | format_money }}</td>
        <td class="center">
            {% if item.discount_percent and item.discount_percent > 0 %}

Before Width: | Height: | Size: 16 KiB
@@ -1,220 +0,0 @@
import logging
import base64
import os
import resend
from config import settings

logger = logging.getLogger(__name__)

_LOGO_PATH = os.path.join(os.path.dirname(__file__), "assets", "bell_systems_horizontal_darkMode.png")
try:
    with open(_LOGO_PATH, "rb") as _f:
        _LOGO_B64 = base64.b64encode(_f.read()).decode()
    _LOGO_SRC = f"data:image/png;base64,{_LOGO_B64}"
except Exception:
    _LOGO_SRC = ""


def send_email(to: str, subject: str, html: str) -> None:
    """Send a transactional email via Resend."""
    try:
        resend.api_key = settings.resend_api_key
        resend.Emails.send({
            "from": settings.email_from,
            "to": to,
            "subject": subject,
            "html": html,
        })
        logger.info("Email sent to %s — subject: %s", to, subject)
    except Exception as exc:
        logger.error("Failed to send email to %s: %s", to, exc)
        raise


_OPT_IN_URL = "https://play.google.com/apps/testing/com.bellsystems.vesper"
_APP_URL = "https://play.google.com/store/apps/details?id=com.bellsystems.vesper"


def send_device_assigned_email(
    user_email: str,
    serial_number: str,
    device_name: str,
    user_name: str | None = None,
) -> None:
    """
    Notify a user that a BellSystems device has been assigned to their account,
    with links to opt in to the Vesper beta programme and download the app.
    """
    greeting = f"Dear {user_name}," if user_name else "Dear valued customer,"

    html = f"""<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Your BellSystems Device Is Ready</title>
</head>
<body style="margin:0; padding:0; background-color:#0d1117; font-family:'Helvetica Neue', Helvetica, Arial, sans-serif;">
<table width="100%" cellpadding="0" cellspacing="0" style="background-color:#0d1117; padding:40px 16px;">
<tr>
<td align="center">
<table width="580" cellpadding="0" cellspacing="0"
       style="background-color:#161b22; border-radius:12px; overflow:hidden;
              box-shadow:0 4px 24px rgba(0,0,0,0.5); max-width:580px; width:100%;
              border:1px solid #30363d;">

    <!-- Header with logo -->
    <tr>
    <td style="background-color:#0f172a; padding:32px 40px 28px; text-align:center;
               border-bottom:1px solid #21262d;">
        {"<img src='" + _LOGO_SRC + "' alt='BellSystems' width='180' style='display:block; margin:0 auto; max-width:180px;'>" if _LOGO_SRC else "<h1 style='color:#ffffff; margin:0; font-size:22px; font-weight:700; letter-spacing:1px;'>BELLSYSTEMS</h1>"}
        <p style="color:#64748b; margin:14px 0 0; font-size:11px; letter-spacing:2.5px;
                  text-transform:uppercase; font-weight:600;">Device Activation</p>
    </td>
    </tr>

    <!-- Body -->
    <tr>
    <td style="padding:36px 40px 28px;">

        <p style="margin:0 0 24px; font-size:16px; color:#c9d1d9; font-weight:500;">
            {greeting}
        </p>

        <p style="margin:0 0 18px; font-size:15px; color:#8b949e; line-height:1.75;">
            Exciting news — your
            <strong style="color:#c9d1d9;">BellSystems {device_name}</strong>
            has been assigned to your account and is ready to use!
        </p>

        <p style="margin:0 0 28px; font-size:15px; color:#8b949e; line-height:1.75;">
            To get started, join the <strong style="color:#c9d1d9;">Vesper</strong> programme
            and download the companion app from the Google Play Store. The app gives you full
            control over your device, including scheduling, customisation, and real-time
            monitoring.
        </p>

        <!-- CTA buttons -->
        <table cellpadding="0" cellspacing="0" width="100%" style="margin:0 0 32px;">
        <tr>
        <td align="center" style="padding-bottom:12px;">
            <a href="{_OPT_IN_URL}"
               style="display:inline-block; background-color:#238636; color:#ffffff;
                      text-decoration:none; padding:14px 32px; border-radius:8px;
                      font-size:14px; font-weight:700; letter-spacing:0.4px;
                      border:1px solid #2ea043; width:240px; text-align:center;">
                Join the Vesper Programme
            </a>
        </td>
        </tr>
        <tr>
        <td align="center">
            <a href="{_APP_URL}"
               style="display:inline-block; background-color:#1f6feb; color:#ffffff;
                      text-decoration:none; padding:14px 32px; border-radius:8px;
                      font-size:14px; font-weight:700; letter-spacing:0.4px;
                      border:1px solid #388bfd; width:240px; text-align:center;">
                Download on Google Play
            </a>
        </td>
        </tr>
        </table>

        <!-- Device info card -->
        <table width="100%" cellpadding="0" cellspacing="0"
               style="background:#0d1117; border:1px solid #30363d; border-radius:8px; margin-bottom:28px;">
        <tr>
        <td style="padding:16px 20px; border-bottom:1px solid #21262d;">
            <span style="font-size:11px; color:#58a6ff; text-transform:uppercase;
                         letter-spacing:1.2px; font-weight:700;">Device Model</span><br>
            <span style="font-size:15px; color:#c9d1d9; font-weight:600; margin-top:4px; display:block;">
                BellSystems {device_name}
            </span>
        </td>
        </tr>
        <tr>
        <td style="padding:16px 20px;">
            <span style="font-size:11px; color:#58a6ff; text-transform:uppercase;
                         letter-spacing:1.2px; font-weight:700;">Serial Number</span><br>
            <code style="font-size:14px; color:#79c0ff; background:#161b22;
                         padding:4px 10px; border-radius:4px; font-family:monospace;
                         border:1px solid #30363d; margin-top:6px; display:inline-block;">
                {serial_number}
            </code>
        </td>
        </tr>
        </table>

        <!-- How it works steps -->
        <table width="100%" cellpadding="0" cellspacing="0"
               style="background:#0d1117; border:1px solid #30363d; border-radius:8px; margin-bottom:28px;">
        <tr>
        <td style="padding:16px 20px; border-bottom:1px solid #21262d;">
            <span style="font-size:11px; color:#8b949e; text-transform:uppercase;
                         letter-spacing:1.2px; font-weight:700;">Getting Started</span>
        </td>
        </tr>
        <tr>
        <td style="padding:14px 20px; border-bottom:1px solid #21262d;">
            <span style="color:#58a6ff; font-weight:700; font-size:13px;">1 </span>
            <span style="color:#8b949e; font-size:13px; line-height:1.6;">
                Click <strong style="color:#c9d1d9;">Join the Vesper Programme</strong> above to opt in via the Google Play testing programme.
            </span>
        </td>
        </tr>
        <tr>
        <td style="padding:14px 20px; border-bottom:1px solid #21262d;">
            <span style="color:#58a6ff; font-weight:700; font-size:13px;">2 </span>
            <span style="color:#8b949e; font-size:13px; line-height:1.6;">
                Download the <strong style="color:#c9d1d9;">Vesper</strong> app from the Google Play Store.
            </span>
        </td>
        </tr>
        <tr>
        <td style="padding:14px 20px;">
            <span style="color:#58a6ff; font-weight:700; font-size:13px;">3 </span>
            <span style="color:#8b949e; font-size:13px; line-height:1.6;">
                Sign in with your account and your device will appear automatically.
            </span>
        </td>
        </tr>
        </table>

        <p style="margin:0; font-size:14px; color:#6e7681; line-height:1.7;">
            If you have any questions or need assistance with setup, our support team is
            always happy to help.
        </p>

    </td>
    </tr>

    <!-- Footer -->
    <tr>
    <td style="background-color:#0d1117; border-top:1px solid #21262d;
               padding:24px 40px; text-align:center;">
        <p style="margin:0 0 6px; font-size:13px; color:#8b949e; font-weight:600;">
            BellSystems.gr
        </p>
        <p style="margin:0; font-size:12px; color:#6e7681;">
            Questions? Contact us at
            <a href="mailto:support@bellsystems.gr"
               style="color:#58a6ff; text-decoration:none;">support@bellsystems.gr</a>
        </p>
        <p style="margin:8px 0 0; font-size:11px; color:#484f58;">
            If you did not expect this notification, please disregard this message.
        </p>
    </td>
    </tr>

</table>
</td>
</tr>
</table>
</body>
</html>"""

    send_email(
        to=user_email,
        subject=f"Your BellSystems {device_name} is ready — get started now!",
        html=html,
    )
@@ -1,155 +0,0 @@
import logging
import base64
import os
import resend
from config import settings

logger = logging.getLogger(__name__)

# Embed logo as base64 so it works in any email client without a public URL
_LOGO_PATH = os.path.join(os.path.dirname(__file__), "assets", "bell_systems_horizontal_darkMode.png")
try:
    with open(_LOGO_PATH, "rb") as _f:
        _LOGO_B64 = base64.b64encode(_f.read()).decode()
    _LOGO_SRC = f"data:image/png;base64,{_LOGO_B64}"
except Exception:
    _LOGO_SRC = ""  # fallback: image won't appear but email still sends


def send_email(to: str, subject: str, html: str) -> None:
    """Send a transactional email via Resend."""
    try:
        resend.api_key = settings.resend_api_key
        resend.Emails.send({
            "from": settings.email_from,
            "to": to,
            "subject": subject,
            "html": html,
        })
        logger.info("Email sent to %s — subject: %s", to, subject)
    except Exception as exc:
        logger.error("Failed to send email to %s: %s", to, exc)
        raise


def send_device_manufactured_email(
    customer_email: str,
    serial_number: str,
    device_name: str,
    customer_name: str | None = None,
) -> None:
    """
    Notify a customer that their BellSystems device has been manufactured
    and is being prepared for shipment.
    """
    greeting = f"Dear {customer_name}," if customer_name else "Dear valued customer,"

    html = f"""<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Your BellSystems Device Has Been Manufactured</title>
</head>
<body style="margin:0; padding:0; background-color:#0d1117; font-family:'Helvetica Neue', Helvetica, Arial, sans-serif;">
<table width="100%" cellpadding="0" cellspacing="0" style="background-color:#0d1117; padding:40px 16px;">
<tr>
<td align="center">
<table width="580" cellpadding="0" cellspacing="0"
       style="background-color:#161b22; border-radius:12px; overflow:hidden;
              box-shadow:0 4px 24px rgba(0,0,0,0.5); max-width:580px; width:100%;
              border:1px solid #30363d;">

<!-- Header with logo -->
<tr>
<td style="background-color:#0f172a; padding:32px 40px 28px; text-align:center;
           border-bottom:1px solid #21262d;">
{"<img src='" + _LOGO_SRC + "' alt='BellSystems' width='180' style='display:block; margin:0 auto; max-width:180px;'>" if _LOGO_SRC else "<h1 style='color:#ffffff; margin:0; font-size:22px; font-weight:700; letter-spacing:1px;'>BELLSYSTEMS</h1>"}
<p style="color:#64748b; margin:14px 0 0; font-size:11px; letter-spacing:2.5px;
          text-transform:uppercase; font-weight:600;">Manufacturing Update</p>
</td>
</tr>

<!-- Body -->
<tr>
<td style="padding:36px 40px 28px;">

<p style="margin:0 0 24px; font-size:16px; color:#c9d1d9; font-weight:500;">
    {greeting}
</p>

<p style="margin:0 0 18px; font-size:15px; color:#8b949e; line-height:1.75;">
    We are pleased to inform you that your
    <strong style="color:#c9d1d9;">BellSystems {device_name}</strong>
    has been successfully manufactured and has passed all quality checks.
</p>

<p style="margin:0 0 28px; font-size:15px; color:#8b949e; line-height:1.75;">
    Your device is now being prepared for delivery. You will receive a separate
    notification with tracking information once it has been dispatched.
</p>

<!-- Device info card -->
<table width="100%" cellpadding="0" cellspacing="0"
       style="background:#0d1117; border:1px solid #30363d; border-radius:8px; margin-bottom:32px;">
<tr>
<td style="padding:16px 20px; border-bottom:1px solid #21262d;">
    <span style="font-size:11px; color:#58a6ff; text-transform:uppercase;
                 letter-spacing:1.2px; font-weight:700;">Device Model</span><br>
    <span style="font-size:15px; color:#c9d1d9; font-weight:600; margin-top:4px; display:block;">
        BellSystems {device_name}
    </span>
</td>
</tr>
<tr>
<td style="padding:16px 20px;">
    <span style="font-size:11px; color:#58a6ff; text-transform:uppercase;
                 letter-spacing:1.2px; font-weight:700;">Serial Number</span><br>
    <code style="font-size:14px; color:#79c0ff; background:#161b22;
                 padding:4px 10px; border-radius:4px; font-family:monospace;
                 border:1px solid #30363d; margin-top:6px; display:inline-block;">
        {serial_number}
    </code>
</td>
</tr>
</table>

<p style="margin:0 0 8px; font-size:14px; color:#6e7681; line-height:1.7;">
    Thank you for choosing BellSystems. We take great pride in crafting each device
    with care and precision, and we look forward to delivering an exceptional
    experience to you.
</p>

</td>
</tr>

<!-- Footer -->
<tr>
<td style="background-color:#0d1117; border-top:1px solid #21262d;
           padding:24px 40px; text-align:center;">
    <p style="margin:0 0 6px; font-size:13px; color:#8b949e; font-weight:600;">
        BellSystems.gr
    </p>
    <p style="margin:0; font-size:12px; color:#6e7681;">
        Questions? Contact us at
        <a href="mailto:support@bellsystems.gr"
           style="color:#58a6ff; text-decoration:none;">support@bellsystems.gr</a>
    </p>
    <p style="margin:8px 0 0; font-size:11px; color:#484f58;">
        If you did not expect this notification, please disregard this message.
    </p>
</td>
</tr>

</table>
</td>
</tr>
</table>
</body>
</html>"""

    send_email(
        to=customer_email,
        subject=f"Your BellSystems {device_name} has been manufactured",
        html=html,
    )
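The inline-logo technique the removed module relied on — encoding a PNG into a `data:` URI so the image renders without hosting it at a public URL — can be sketched on its own. This is an illustrative stand-alone version, not the module's code; `fake_png` is placeholder bytes, not the real asset:

```python
import base64


def to_data_uri(png_bytes: bytes) -> str:
    """Embed raw PNG bytes as a data: URI usable in an <img src=...> attribute."""
    b64 = base64.b64encode(png_bytes).decode()
    return f"data:image/png;base64,{b64}"


# Placeholder bytes standing in for the real logo file (PNG magic + padding).
fake_png = b"\x89PNG\r\n\x1a\n" + b"\x00" * 16
uri = to_data_uri(fake_png)
print(uri[:22])  # data:image/png;base64,
```

Note that some email clients strip or block `data:` images, which is presumably why the module kept a text-only fallback when the asset could not be read.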

@@ -177,16 +177,12 @@ def _build_page(entries: List[bytes], slot_counts: List[int], seq: int = 0) -> b
    return page


def generate(serial_number: str, hw_family: str, hw_revision: str, legacy: bool = False) -> bytes:
def generate(serial_number: str, hw_type: str, hw_version: str) -> bytes:
    """Generate a 0x5000-byte NVS partition binary for a Vesper device.

    serial_number: full SN string e.g. 'BSVSPR-26C13X-STD01R-X7KQA'
    hw_family: board family e.g. 'vesper-standard', 'vesper-plus'
    hw_revision: hardware revision string e.g. '1.0'
    legacy: if True, writes old key names expected by legacy firmware (pre-new-schema):
        device_uid, hw_type, hw_version
    if False (default), writes new key names:
        serial, hw_family, hw_revision
    serial_number: full SN string e.g. 'PV-26B27-VS01R-X7KQA'
    hw_type: board type e.g. 'vesper', 'vesper_plus', 'vesper_pro'
    hw_version: zero-padded version e.g. '01'

    Returns raw bytes ready to flash at 0x9000.
    """
@@ -194,14 +190,9 @@ def generate(serial_number: str, hw_family: str, hw_revision: str, legacy: bool

    # Build entries for namespace "device_id"
    ns_entry, ns_span = _build_namespace_entry("device_id", ns_index)
    if legacy:
        uid_entry, uid_span = _build_string_entry(ns_index, "device_uid", serial_number)
        hwt_entry, hwt_span = _build_string_entry(ns_index, "hw_type", hw_family.lower())
        hwv_entry, hwv_span = _build_string_entry(ns_index, "hw_version", hw_revision)
    else:
        uid_entry, uid_span = _build_string_entry(ns_index, "serial", serial_number)
        hwt_entry, hwt_span = _build_string_entry(ns_index, "hw_family", hw_family.lower())
        hwv_entry, hwv_span = _build_string_entry(ns_index, "hw_revision", hw_revision)
    hwt_entry, hwt_span = _build_string_entry(ns_index, "hw_type", hw_type.lower())
    hwv_entry, hwv_span = _build_string_entry(ns_index, "hw_version", hw_version)

    entries = [ns_entry, uid_entry, hwt_entry, hwv_entry]
    spans = [ns_span, uid_span, hwt_span, hwv_span]
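The two key schemas this hunk switches between (named in the docstring above) can be summarized as a plain mapping. Illustrative only — the key names come from the diff, the helper itself is hypothetical:

```python
# Hypothetical helper: which NVS key names each schema uses for the same
# three logical fields, per the generate() docstring in the diff above.
def nvs_keys(legacy: bool) -> dict:
    if legacy:
        return {"serial": "device_uid", "family": "hw_type", "revision": "hw_version"}
    return {"serial": "serial", "family": "hw_family", "revision": "hw_revision"}


print(sorted(nvs_keys(True).values()))   # ['device_uid', 'hw_type', 'hw_version']
print(sorted(nvs_keys(False).values()))  # ['hw_family', 'hw_revision', 'serial']
```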

@@ -4,75 +4,17 @@ from datetime import datetime
MONTH_CODES = "ABCDEFGHIJKL"
SAFE_CHARS = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"  # No 0, O, 1, I — avoids label confusion

# Family segment (chars 3-6 of segment 1, after "BS")
BOARD_FAMILY_CODES = {
    "vesper": "VSPR",
    "vesper_plus": "VSPR",
    "vesper_pro": "VSPR",
    "agnus": "AGNS",
    "agnus_mini": "AGNS",
    "chronos": "CRNS",
    "chronos_pro": "CRNS",
}

# Variant segment (first 3 chars of segment 3)
BOARD_VARIANT_CODES = {
    "vesper": "STD",
    "vesper_plus": "PLS",
    "vesper_pro": "PRO",
    "agnus": "STD",
    "agnus_mini": "MIN",
    "chronos": "STD",
    "chronos_pro": "PRO",
}


def _version_suffix(board_version: str) -> str:
    """Convert version string to 3-char suffix.

    Rules:
    - Strip the dot: "2.3" → "23", "10.2" → "102"
    - If result is 2 digits, append "R": "23" → "23R"
    - If result is already 3 digits, use as-is: "102" → "102"
    """
    digits = board_version.replace(".", "")
    if len(digits) >= 3:
        return digits[:3]
    return digits.ljust(2, "0") + "R"


def generate_serial(board_type: str, board_version: str) -> str:
    """Generate a serial number in the format BSFFFF-YYMDDFX-VVVHHH-XXXXXX.
    """Generate a serial number in the format PV-YYMMM-BBTTR-XXXXX.

    Format: BSFFFF-YYMDDf-VVVvvv-XXXXXX
        BS = Bell Systems (static)
        FFFF = 4-char family code (VSPR, AGNS, CRNS)
        YY = 2-digit year
        M = month code A-L
        DD = 2-digit day
        f = random filler char
        VVV = 3-char variant (STD, PLS, PRO, MIN)
        vvv = 3-char version suffix (e.g. 23R, 102)
        XXXXXX = 6-char random suffix

    board_type: enum value e.g. 'vesper', 'vesper_plus', 'vesper_pro'
    board_version: version string e.g. '2.3', '10.2'
    board_type: 2-char uppercase code, e.g. 'VS', 'VP', 'VX'
    board_version: 2-char zero-padded version, e.g. '01'
    """
    key = board_type.lower()
    family = BOARD_FAMILY_CODES.get(key, "UNKN")
    variant = BOARD_VARIANT_CODES.get(key, "UNK")
    ver = _version_suffix(board_version)

    now = datetime.utcnow()
    year = now.strftime("%y")
    month = MONTH_CODES[now.month - 1]
    day = now.strftime("%d")
    filler = random.choice(SAFE_CHARS)
    suffix = "".join(random.choices(SAFE_CHARS, k=6))

    seg1 = f"BS{family}"                  # e.g. BSVSPR
    seg2 = f"{year}{month}{day}{filler}"  # e.g. 26C13X
    seg3 = f"{variant}{ver}"              # e.g. PRO23R
    seg4 = suffix                         # e.g. X9K4M2

    return f"{seg1}-{seg2}-{seg3}-{seg4}"
    suffix = "".join(random.choices(SAFE_CHARS, k=5))
    version_clean = board_version.replace(".", "")
    return f"PV-{year}{month}{day}-{board_type.upper()}{version_clean}R-{suffix}"
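The `PV-` format from the return statement above can be exercised as a self-contained sketch (module-level constants copied from the diff; the rest of the module is not needed):

```python
import random
import re
from datetime import datetime

# Constants as shown in the diff above.
MONTH_CODES = "ABCDEFGHIJKL"
SAFE_CHARS = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"  # no 0, O, 1, I


def generate_serial(board_type: str, board_version: str) -> str:
    """Mirror of the PV- scheme: PV-{yy}{month code}{dd}-{TYPE}{ver}R-{5 safe chars}."""
    now = datetime.utcnow()
    year = now.strftime("%y")
    month = MONTH_CODES[now.month - 1]
    day = now.strftime("%d")
    suffix = "".join(random.choices(SAFE_CHARS, k=5))
    version_clean = board_version.replace(".", "")
    return f"PV-{year}{month}{day}-{board_type.upper()}{version_clean}R-{suffix}"


sn = generate_serial("vs", "01")
# Shape: PV-<2 digits><A-L><2 digits>-VS01R-<5 chars from SAFE_CHARS>
assert re.fullmatch(r"PV-\d{2}[A-L]\d{2}-VS01R-[A-HJ-NP-Z2-9]{5}", sn)
```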

@@ -6,10 +6,9 @@ services:
    volumes:
      - ./backend:/app
      # Persistent data - lives outside the container
      - ./data:/app/data
      - ./data/mqtt_data.db:/app/mqtt_data.db
      - ./data/built_melodies:/app/storage/built_melodies
      - ./data/firmware:/app/storage/firmware
      - ./data/flash_assets:/app/storage/flash_assets
      - ./data/firebase-service-account.json:/app/firebase-service-account.json:ro
      # Auto-deploy: project root so container can write the trigger file
      - /home/bellsystems/bellsystems-cp:/home/bellsystems/bellsystems-cp
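Bind mounts like the ones above must exist on the host before `docker compose up`: Docker auto-creates a missing host path as a root-owned directory, which in particular breaks the file mounts (`mqtt_data.db`, `firebase-service-account.json`). A hedged pre-flight sketch — the helper and the `/tmp` path are illustrative, not part of the repo:

```python
# Illustrative pre-flight: create host-side bind-mount paths up front so
# Docker never has to auto-create them (it would create directories, even
# where the compose file expects files).
from pathlib import Path


def prepare_data_dir(data_dir: str) -> None:
    root = Path(data_dir)
    for d in ("built_melodies", "firmware", "flash_assets"):
        (root / d).mkdir(parents=True, exist_ok=True)
    # Must exist as a *file* before being bind-mounted into the container.
    (root / "mqtt_data.db").touch(exist_ok=True)


prepare_data_dir("/tmp/vesper-data")
print(Path("/tmp/vesper-data/mqtt_data.db").is_file())  # True
```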

frontend/package-lock.json (generated, 704 lines changed)
@@ -10,7 +10,6 @@
      "dependencies": {
        "esptool-js": "^0.5.7",
        "leaflet": "^1.9.4",
        "qrcode": "^1.5.4",
        "react": "^19.2.0",
        "react-dom": "^19.2.0",
        "react-leaflet": "^5.0.0",
@@ -27,8 +26,7 @@
        "eslint-plugin-react-refresh": "^0.4.24",
        "globals": "^16.5.0",
        "tailwindcss": "^4.1.18",
        "vite": "^7.3.1",
        "vite-plugin-svgr": "^4.5.0"
        "vite": "^7.3.1"
      }
    },
    "node_modules/@babel/code-frame": {
@@ -1032,29 +1030,6 @@
      "dev": true,
      "license": "MIT"
    },
    "node_modules/@rollup/pluginutils": {
      "version": "5.3.0",
      "resolved": "https://registry.npmjs.org/@rollup/pluginutils/-/pluginutils-5.3.0.tgz",
      "integrity": "sha512-5EdhGZtnu3V88ces7s53hhfK5KSASnJZv8Lulpc04cWO3REESroJXg73DFsOmgbU2BhwV0E20bu2IDZb3VKW4Q==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "@types/estree": "^1.0.0",
        "estree-walker": "^2.0.2",
        "picomatch": "^4.0.2"
      },
      "engines": {
        "node": ">=14.0.0"
      },
      "peerDependencies": {
        "rollup": "^1.20.0||^2.0.0||^3.0.0||^4.0.0"
      },
      "peerDependenciesMeta": {
        "rollup": {
          "optional": true
        }
      }
    },
    "node_modules/@rollup/rollup-android-arm-eabi": {
      "version": "4.57.1",
      "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.57.1.tgz",
@@ -1405,231 +1380,6 @@
        "win32"
      ]
    },
    "node_modules/@svgr/babel-plugin-add-jsx-attribute": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-add-jsx-attribute/-/babel-plugin-add-jsx-attribute-8.0.0.tgz",
      "integrity": "sha512-b9MIk7yhdS1pMCZM8VeNfUlSKVRhsHZNMl5O9SfaX0l0t5wjdgu4IDzGB8bpnGBBOjGST3rRFVsaaEtI4W6f7g==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-remove-jsx-attribute": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-remove-jsx-attribute/-/babel-plugin-remove-jsx-attribute-8.0.0.tgz",
      "integrity": "sha512-BcCkm/STipKvbCl6b7QFrMh/vx00vIP63k2eM66MfHJzPr6O2U0jYEViXkHJWqXqQYjdeA9cuCl5KWmlwjDvbA==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-remove-jsx-empty-expression": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-remove-jsx-empty-expression/-/babel-plugin-remove-jsx-empty-expression-8.0.0.tgz",
      "integrity": "sha512-5BcGCBfBxB5+XSDSWnhTThfI9jcO5f0Ai2V24gZpG+wXF14BzwxxdDb4g6trdOux0rhibGs385BeFMSmxtS3uA==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-replace-jsx-attribute-value": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-replace-jsx-attribute-value/-/babel-plugin-replace-jsx-attribute-value-8.0.0.tgz",
      "integrity": "sha512-KVQ+PtIjb1BuYT3ht8M5KbzWBhdAjjUPdlMtpuw/VjT8coTrItWX6Qafl9+ji831JaJcu6PJNKCV0bp01lBNzQ==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-svg-dynamic-title": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-svg-dynamic-title/-/babel-plugin-svg-dynamic-title-8.0.0.tgz",
      "integrity": "sha512-omNiKqwjNmOQJ2v6ge4SErBbkooV2aAWwaPFs2vUY7p7GhVkzRkJ00kILXQvRhA6miHnNpXv7MRnnSjdRjK8og==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-svg-em-dimensions": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-svg-em-dimensions/-/babel-plugin-svg-em-dimensions-8.0.0.tgz",
      "integrity": "sha512-mURHYnu6Iw3UBTbhGwE/vsngtCIbHE43xCRK7kCw4t01xyGqb2Pd+WXekRRoFOBIY29ZoOhUCTEweDMdrjfi9g==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-transform-react-native-svg": {
      "version": "8.1.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-transform-react-native-svg/-/babel-plugin-transform-react-native-svg-8.1.0.tgz",
      "integrity": "sha512-Tx8T58CHo+7nwJ+EhUwx3LfdNSG9R2OKfaIXXs5soiy5HtgoAEkDay9LIimLOcG8dJQH1wPZp/cnAv6S9CrR1Q==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-plugin-transform-svg-component": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-transform-svg-component/-/babel-plugin-transform-svg-component-8.0.0.tgz",
      "integrity": "sha512-DFx8xa3cZXTdb/k3kfPeaixecQLgKh5NVBMwD0AQxOzcZawK4oo1Jh9LbrcACUivsCA7TLG8eeWgrDXjTMhRmw==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=12"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/babel-preset": {
      "version": "8.1.0",
      "resolved": "https://registry.npmjs.org/@svgr/babel-preset/-/babel-preset-8.1.0.tgz",
      "integrity": "sha512-7EYDbHE7MxHpv4sxvnVPngw5fuR6pw79SkcrILHJ/iMpuKySNCl5W1qcwPEpU+LgyRXOaAFgH0KhwD18wwg6ug==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "@svgr/babel-plugin-add-jsx-attribute": "8.0.0",
        "@svgr/babel-plugin-remove-jsx-attribute": "8.0.0",
        "@svgr/babel-plugin-remove-jsx-empty-expression": "8.0.0",
        "@svgr/babel-plugin-replace-jsx-attribute-value": "8.0.0",
        "@svgr/babel-plugin-svg-dynamic-title": "8.0.0",
        "@svgr/babel-plugin-svg-em-dimensions": "8.0.0",
        "@svgr/babel-plugin-transform-react-native-svg": "8.1.0",
        "@svgr/babel-plugin-transform-svg-component": "8.0.0"
      },
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@babel/core": "^7.0.0-0"
      }
    },
    "node_modules/@svgr/core": {
      "version": "8.1.0",
      "resolved": "https://registry.npmjs.org/@svgr/core/-/core-8.1.0.tgz",
      "integrity": "sha512-8QqtOQT5ACVlmsvKOJNEaWmRPmcojMOzCz4Hs2BGG/toAp/K38LcsMRyLp349glq5AzJbCEeimEoxaX6v/fLrA==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "@babel/core": "^7.21.3",
        "@svgr/babel-preset": "8.1.0",
        "camelcase": "^6.2.0",
        "cosmiconfig": "^8.1.3",
        "snake-case": "^3.0.4"
      },
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      }
    },
    "node_modules/@svgr/hast-util-to-babel-ast": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/@svgr/hast-util-to-babel-ast/-/hast-util-to-babel-ast-8.0.0.tgz",
      "integrity": "sha512-EbDKwO9GpfWP4jN9sGdYwPBU0kdomaPIL2Eu4YwmgP+sJeXT+L7bMwJUBnhzfH8Q2qMBqZ4fJwpCyYsAN3mt2Q==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "@babel/types": "^7.21.3",
        "entities": "^4.4.0"
      },
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      }
    },
    "node_modules/@svgr/plugin-jsx": {
      "version": "8.1.0",
      "resolved": "https://registry.npmjs.org/@svgr/plugin-jsx/-/plugin-jsx-8.1.0.tgz",
      "integrity": "sha512-0xiIyBsLlr8quN+WyuxooNW9RJ0Dpr8uOnH/xrCVO8GLUcwHISwj1AG0k+LFzteTkAA0GbX0kj9q6Dk70PTiPA==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "@babel/core": "^7.21.3",
        "@svgr/babel-preset": "8.1.0",
        "@svgr/hast-util-to-babel-ast": "8.0.0",
        "svg-parser": "^2.0.4"
      },
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/gregberge"
      },
      "peerDependencies": {
        "@svgr/core": "*"
      }
    },
    "node_modules/@tailwindcss/node": {
      "version": "4.1.18",
      "resolved": "https://registry.npmjs.org/@tailwindcss/node/-/node-4.1.18.tgz",
@@ -2042,19 +1792,11 @@
        "url": "https://github.com/sponsors/epoberezkin"
      }
    },
    "node_modules/ansi-regex": {
      "version": "5.0.1",
      "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
      "integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
      "license": "MIT",
      "engines": {
        "node": ">=8"
      }
    },
    "node_modules/ansi-styles": {
      "version": "4.3.0",
      "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
      "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "color-convert": "^2.0.1"
@@ -2151,19 +1893,6 @@
        "node": ">=6"
      }
    },
    "node_modules/camelcase": {
      "version": "6.3.0",
      "resolved": "https://registry.npmjs.org/camelcase/-/camelcase-6.3.0.tgz",
      "integrity": "sha512-Gmy6FhYlCY7uOElZUSbxo2UCDH8owEk996gkbrpsgGtrJLM3J7jGxl9Ic7Qwwj4ivOE5AWZWRMecDdF7hqGjFA==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=10"
      },
      "funding": {
        "url": "https://github.com/sponsors/sindresorhus"
      }
    },
    "node_modules/caniuse-lite": {
      "version": "1.0.30001770",
      "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001770.tgz",
@@ -2202,21 +1931,11 @@
        "url": "https://github.com/chalk/chalk?sponsor=1"
      }
    },
    "node_modules/cliui": {
      "version": "6.0.0",
      "resolved": "https://registry.npmjs.org/cliui/-/cliui-6.0.0.tgz",
      "integrity": "sha512-t6wbgtoCXvAzst7QgXxJYqPt0usEfbgQdftEPbLL/cvv6HPE5VgvqCuAIDR0NgU52ds6rFwqrgakNLrHEjCbrQ==",
      "license": "ISC",
      "dependencies": {
        "string-width": "^4.2.0",
        "strip-ansi": "^6.0.0",
        "wrap-ansi": "^6.2.0"
      }
    },
    "node_modules/color-convert": {
      "version": "2.0.1",
      "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
      "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "color-name": "~1.1.4"
@@ -2229,6 +1948,7 @@
      "version": "1.1.4",
      "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
      "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
      "dev": true,
      "license": "MIT"
    },
    "node_modules/concat-map": {
@@ -2258,33 +1978,6 @@
        "url": "https://opencollective.com/express"
      }
    },
    "node_modules/cosmiconfig": {
      "version": "8.3.6",
      "resolved": "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-8.3.6.tgz",
      "integrity": "sha512-kcZ6+W5QzcJ3P1Mt+83OUv/oHFqZHIx8DuxG6eZ5RGMERoLqp4BuGjhHLYGK+Kf5XVkQvqBSmAy/nGWN3qDgEA==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "import-fresh": "^3.3.0",
        "js-yaml": "^4.1.0",
        "parse-json": "^5.2.0",
        "path-type": "^4.0.0"
      },
      "engines": {
        "node": ">=14"
      },
      "funding": {
        "url": "https://github.com/sponsors/d-fischer"
      },
      "peerDependencies": {
        "typescript": ">=4.9.5"
      },
      "peerDependenciesMeta": {
        "typescript": {
          "optional": true
        }
      }
    },
    "node_modules/cross-spawn": {
      "version": "7.0.6",
      "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
@@ -2325,15 +2018,6 @@
        }
      }
    },
    "node_modules/decamelize": {
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/decamelize/-/decamelize-1.2.0.tgz",
      "integrity": "sha512-z2S+W9X73hAUUki+N+9Za2lBlun89zigOyGrsax+KUQ6wKW4ZoWpEYBkGhQjwAjjDCkWxhY0VKEhk8wzY7F5cA==",
      "license": "MIT",
      "engines": {
        "node": ">=0.10.0"
      }
    },
    "node_modules/deep-is": {
      "version": "0.1.4",
      "resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz",
@@ -2351,23 +2035,6 @@
        "node": ">=8"
      }
    },
    "node_modules/dijkstrajs": {
      "version": "1.0.3",
      "resolved": "https://registry.npmjs.org/dijkstrajs/-/dijkstrajs-1.0.3.tgz",
      "integrity": "sha512-qiSlmBq9+BCdCA/L46dw8Uy93mloxsPSbwnm5yrKn2vMPiy8KyAskTF6zuV/j5BMsmOGZDPs7KjU+mjb670kfA==",
      "license": "MIT"
    },
    "node_modules/dot-case": {
      "version": "3.0.4",
      "resolved": "https://registry.npmjs.org/dot-case/-/dot-case-3.0.4.tgz",
      "integrity": "sha512-Kv5nKlh6yRrdrGvxeJ2e5y2eRUpkUosIW4A2AS38zwSz27zu7ufDwQPi5Jhs3XAlGNetl3bmnGhQsMtkKJnj3w==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "no-case": "^3.0.4",
        "tslib": "^2.0.3"
      }
    },
    "node_modules/electron-to-chromium": {
      "version": "1.5.286",
      "resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.286.tgz",
@@ -2375,12 +2042,6 @@
      "dev": true,
      "license": "ISC"
    },
    "node_modules/emoji-regex": {
      "version": "8.0.0",
      "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
      "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==",
      "license": "MIT"
    },
    "node_modules/enhanced-resolve": {
      "version": "5.19.0",
      "resolved": "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-5.19.0.tgz",
@@ -2395,29 +2056,6 @@
        "node": ">=10.13.0"
      }
    },
    "node_modules/entities": {
      "version": "4.5.0",
      "resolved": "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz",
      "integrity": "sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==",
      "dev": true,
      "license": "BSD-2-Clause",
      "engines": {
        "node": ">=0.12"
      },
      "funding": {
        "url": "https://github.com/fb55/entities?sponsor=1"
      }
    },
    "node_modules/error-ex": {
      "version": "1.3.4",
      "resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.4.tgz",
      "integrity": "sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "is-arrayish": "^0.2.1"
      }
    },
    "node_modules/esbuild": {
      "version": "0.27.3",
      "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.27.3.tgz",
@@ -2668,13 +2306,6 @@
        "node": ">=4.0"
      }
    },
    "node_modules/estree-walker": {
      "version": "2.0.2",
      "resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-2.0.2.tgz",
      "integrity": "sha512-Rfkk/Mp/DL7JVje3u18FxFujQlTNR2q6QfMSMB7AvCBx91NGj/ba3kCfza0f6dVDbw7YlRf/nDrn7pQrCCyQ/w==",
      "dev": true,
      "license": "MIT"
    },
    "node_modules/esutils": {
      "version": "2.0.3",
      "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
@@ -2800,15 +2431,6 @@
        "node": ">=6.9.0"
      }
    },
    "node_modules/get-caller-file": {
      "version": "2.0.5",
      "resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
      "integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==",
      "license": "ISC",
      "engines": {
        "node": "6.* || 8.* || >= 10.*"
      }
    },
    "node_modules/glob-parent": {
      "version": "6.0.2",
      "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz",
@@ -2906,13 +2528,6 @@
        "node": ">=0.8.19"
      }
    },
    "node_modules/is-arrayish": {
      "version": "0.2.1",
      "resolved": "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.2.1.tgz",
      "integrity": "sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==",
      "dev": true,
      "license": "MIT"
    },
    "node_modules/is-extglob": {
      "version": "2.1.1",
      "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
@@ -2923,15 +2538,6 @@
        "node": ">=0.10.0"
      }
    },
    "node_modules/is-fullwidth-code-point": {
      "version": "3.0.0",
      "resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
      "integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
      "license": "MIT",
      "engines": {
        "node": ">=8"
      }
    },
    "node_modules/is-glob": {
      "version": "4.0.3",
      "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz",
@@ -3002,13 +2608,6 @@
      "dev": true,
      "license": "MIT"
    },
    "node_modules/json-parse-even-better-errors": {
      "version": "2.3.1",
      "resolved": "https://registry.npmjs.org/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz",
      "integrity": "sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w==",
      "dev": true,
      "license": "MIT"
    },
    "node_modules/json-schema-traverse": {
      "version": "0.4.1",
      "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz",
@@ -3327,13 +2926,6 @@
        "url": "https://opencollective.com/parcel"
      }
    },
    "node_modules/lines-and-columns": {
      "version": "1.2.4",
      "resolved": "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.2.4.tgz",
      "integrity": "sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==",
      "dev": true,
      "license": "MIT"
    },
    "node_modules/locate-path": {
      "version": "6.0.0",
      "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz",
@@ -3357,16 +2949,6 @@
      "dev": true,
      "license": "MIT"
    },
    "node_modules/lower-case": {
      "version": "2.0.2",
      "resolved": "https://registry.npmjs.org/lower-case/-/lower-case-2.0.2.tgz",
      "integrity": "sha512-7fm3l3NAF9WfN6W3JOmf5drwpVqX78JtoGJ3A6W0a6ZnldM41w2fV5D490psKFTpMds8TJse/eHLFFsNHHjHgg==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "tslib": "^2.0.3"
      }
    },
    "node_modules/lru-cache": {
      "version": "5.1.1",
      "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
@@ -3433,17 +3015,6 @@
      "dev": true,
      "license": "MIT"
    },
    "node_modules/no-case": {
      "version": "3.0.4",
      "resolved": "https://registry.npmjs.org/no-case/-/no-case-3.0.4.tgz",
      "integrity": "sha512-fgAN3jGAh+RoxUGZHTSOLJIqUc2wmoBwGR4tbpNAKmmovFoWq0OdRkb0VkldReO2a2iBT/OEulG9XSUc10r3zg==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "lower-case": "^2.0.2",
        "tslib": "^2.0.3"
      }
    },
    "node_modules/node-releases": {
      "version": "2.0.27",
      "resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.27.tgz",
@@ -3501,15 +3072,6 @@
        "url": "https://github.com/sponsors/sindresorhus"
      }
    },
    "node_modules/p-try": {
      "version": "2.2.0",
      "resolved": "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz",
      "integrity": "sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==",
      "license": "MIT",
      "engines": {
        "node": ">=6"
      }
    },
    "node_modules/pako": {
      "version": "2.1.0",
      "resolved": "https://registry.npmjs.org/pako/-/pako-2.1.0.tgz",
@@ -3529,29 +3091,11 @@
        "node": ">=6"
      }
    },
    "node_modules/parse-json": {
      "version": "5.2.0",
      "resolved": "https://registry.npmjs.org/parse-json/-/parse-json-5.2.0.tgz",
      "integrity": "sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg==",
      "dev": true,
      "license": "MIT",
      "dependencies": {
        "@babel/code-frame": "^7.0.0",
        "error-ex": "^1.3.1",
        "json-parse-even-better-errors": "^2.3.0",
        "lines-and-columns": "^1.1.6"
      },
      "engines": {
        "node": ">=8"
      },
      "funding": {
        "url": "https://github.com/sponsors/sindresorhus"
      }
    },
    "node_modules/path-exists": {
      "version": "4.0.0",
      "resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
      "integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=8"
@@ -3567,16 +3111,6 @@
        "node": ">=8"
      }
    },
    "node_modules/path-type": {
      "version": "4.0.0",
      "resolved": "https://registry.npmjs.org/path-type/-/path-type-4.0.0.tgz",
      "integrity": "sha512-gDKb8aZMDeD/tZWs9P6+q0J9Mwkdl6xMV8TjnGP3qJVJ06bdMgkbBlLU8IdfOsIsFz2BW1rNVT3XuNEl8zPAvw==",
      "dev": true,
      "license": "MIT",
      "engines": {
        "node": ">=8"
      }
    },
    "node_modules/picocolors": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz",
|
||||
@@ -3597,15 +3131,6 @@
|
||||
"url": "https://github.com/sponsors/jonschlinkert"
|
||||
}
|
||||
},
|
||||
"node_modules/pngjs": {
|
||||
"version": "5.0.0",
|
||||
"resolved": "https://registry.npmjs.org/pngjs/-/pngjs-5.0.0.tgz",
|
||||
"integrity": "sha512-40QW5YalBNfQo5yRYmiw7Yz6TKKVr3h6970B2YE+3fQpsWcrbj1PzJgxeJ19DRQjhMbKPIuMY8rFaXc8moolVw==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=10.13.0"
|
||||
}
|
||||
},
|
||||
"node_modules/postcss": {
|
||||
"version": "8.5.6",
|
||||
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz",
|
||||
@@ -3655,23 +3180,6 @@
|
||||
"node": ">=6"
|
||||
}
|
||||
},
|
||||
"node_modules/qrcode": {
|
||||
"version": "1.5.4",
|
||||
"resolved": "https://registry.npmjs.org/qrcode/-/qrcode-1.5.4.tgz",
|
||||
"integrity": "sha512-1ca71Zgiu6ORjHqFBDpnSMTR2ReToX4l1Au1VFLyVeBTFavzQnv5JxMFr3ukHVKpSrSA2MCk0lNJSykjUfz7Zg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"dijkstrajs": "^1.0.1",
|
||||
"pngjs": "^5.0.0",
|
||||
"yargs": "^15.3.1"
|
||||
},
|
||||
"bin": {
|
||||
"qrcode": "bin/qrcode"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=10.13.0"
|
||||
}
|
||||
},
|
||||
"node_modules/react": {
|
||||
"version": "19.2.4",
|
||||
"resolved": "https://registry.npmjs.org/react/-/react-19.2.4.tgz",
|
||||
@@ -3755,21 +3263,6 @@
|
||||
"react-dom": ">=18"
|
||||
}
|
||||
},
|
||||
"node_modules/require-directory": {
|
||||
"version": "2.1.1",
|
||||
"resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
|
||||
"integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=0.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/require-main-filename": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/require-main-filename/-/require-main-filename-2.0.0.tgz",
|
||||
"integrity": "sha512-NKN5kMDylKuldxYLSUfrbo5Tuzh4hd+2E8NPPX02mZtn1VuREQToYe/ZdlJy+J3uCpfaiGF05e7B8W0iXbQHmg==",
|
||||
"license": "ISC"
|
||||
},
|
||||
"node_modules/resolve-from": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz",
|
||||
@@ -3841,12 +3334,6 @@
|
||||
"semver": "bin/semver.js"
|
||||
}
|
||||
},
|
||||
"node_modules/set-blocking": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/set-blocking/-/set-blocking-2.0.0.tgz",
|
||||
"integrity": "sha512-KiKBS8AnWGEyLzofFfmvKwpdPzqiy16LvQfK3yv/fVH7Bj13/wl3JSR1J+rfgRE9q7xUJK4qvgS8raSOeLUehw==",
|
||||
"license": "ISC"
|
||||
},
|
||||
"node_modules/set-cookie-parser": {
|
||||
"version": "2.7.2",
|
||||
"resolved": "https://registry.npmjs.org/set-cookie-parser/-/set-cookie-parser-2.7.2.tgz",
|
||||
@@ -3876,17 +3363,6 @@
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/snake-case": {
|
||||
"version": "3.0.4",
|
||||
"resolved": "https://registry.npmjs.org/snake-case/-/snake-case-3.0.4.tgz",
|
||||
"integrity": "sha512-LAOh4z89bGQvl9pFfNF8V146i7o7/CqFPbqzYgP+yYzDIDeS9HaNFtXABamRW+AQzEVODcvE79ljJ+8a9YSdMg==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"dot-case": "^3.0.4",
|
||||
"tslib": "^2.0.3"
|
||||
}
|
||||
},
|
||||
"node_modules/source-map-js": {
|
||||
"version": "1.2.1",
|
||||
"resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz",
|
||||
@@ -3897,32 +3373,6 @@
|
||||
"node": ">=0.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/string-width": {
|
||||
"version": "4.2.3",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
|
||||
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"emoji-regex": "^8.0.0",
|
||||
"is-fullwidth-code-point": "^3.0.0",
|
||||
"strip-ansi": "^6.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-ansi": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
|
||||
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^5.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-json-comments": {
|
||||
"version": "3.1.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz",
|
||||
@@ -3949,13 +3399,6 @@
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/svg-parser": {
|
||||
"version": "2.0.4",
|
||||
"resolved": "https://registry.npmjs.org/svg-parser/-/svg-parser-2.0.4.tgz",
|
||||
"integrity": "sha512-e4hG1hRwoOdRb37cIMSgzNsxyzKfayW6VOflrwvR+/bzrkyxY/31WkbgnQpgtrNp1SdpJvpUAGTa/ZoiPNDuRQ==",
|
||||
"dev": true,
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/tailwindcss": {
|
||||
"version": "4.1.18",
|
||||
"resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.1.18.tgz",
|
||||
@@ -4129,21 +3572,6 @@
|
||||
}
|
||||
}
|
||||
},
|
||||
"node_modules/vite-plugin-svgr": {
|
||||
"version": "4.5.0",
|
||||
"resolved": "https://registry.npmjs.org/vite-plugin-svgr/-/vite-plugin-svgr-4.5.0.tgz",
|
||||
"integrity": "sha512-W+uoSpmVkSmNOGPSsDCWVW/DDAyv+9fap9AZXBvWiQqrboJ08j2vh0tFxTD/LjwqwAd3yYSVJgm54S/1GhbdnA==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@rollup/pluginutils": "^5.2.0",
|
||||
"@svgr/core": "^8.1.0",
|
||||
"@svgr/plugin-jsx": "^8.1.0"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"vite": ">=2.6.0"
|
||||
}
|
||||
},
|
||||
"node_modules/which": {
|
||||
"version": "2.0.2",
|
||||
"resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
|
||||
@@ -4160,12 +3588,6 @@
|
||||
"node": ">= 8"
|
||||
}
|
||||
},
|
||||
"node_modules/which-module": {
|
||||
"version": "2.0.1",
|
||||
"resolved": "https://registry.npmjs.org/which-module/-/which-module-2.0.1.tgz",
|
||||
"integrity": "sha512-iBdZ57RDvnOR9AGBhML2vFZf7h8vmBjhoaZqODJBFWHVtKkDmKuHai3cx5PgVMrX5YDNp27AofYbAwctSS+vhQ==",
|
||||
"license": "ISC"
|
||||
},
|
||||
"node_modules/word-wrap": {
|
||||
"version": "1.2.5",
|
||||
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",
|
||||
@@ -4176,26 +3598,6 @@
|
||||
"node": ">=0.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi": {
|
||||
"version": "6.2.0",
|
||||
"resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-6.2.0.tgz",
|
||||
"integrity": "sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-styles": "^4.0.0",
|
||||
"string-width": "^4.1.0",
|
||||
"strip-ansi": "^6.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/y18n": {
|
||||
"version": "4.0.3",
|
||||
"resolved": "https://registry.npmjs.org/y18n/-/y18n-4.0.3.tgz",
|
||||
"integrity": "sha512-JKhqTOwSrqNA1NY5lSztJ1GrBiUodLMmIZuLiDaMRJ+itFd+ABVE8XBjOvIWL+rSqNDC74LCSFmlb/U4UZ4hJQ==",
|
||||
"license": "ISC"
|
||||
},
|
||||
"node_modules/yallist": {
|
||||
"version": "3.1.1",
|
||||
"resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz",
|
||||
@@ -4203,102 +3605,6 @@
|
||||
"dev": true,
|
||||
"license": "ISC"
|
||||
},
|
||||
"node_modules/yargs": {
|
||||
"version": "15.4.1",
|
||||
"resolved": "https://registry.npmjs.org/yargs/-/yargs-15.4.1.tgz",
|
||||
"integrity": "sha512-aePbxDmcYW++PaqBsJ+HYUFwCdv4LVvdnhBy78E57PIor8/OVvhMrADFFEDh8DHDFRv/O9i3lPhsENjO7QX0+A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"cliui": "^6.0.0",
|
||||
"decamelize": "^1.2.0",
|
||||
"find-up": "^4.1.0",
|
||||
"get-caller-file": "^2.0.1",
|
||||
"require-directory": "^2.1.1",
|
||||
"require-main-filename": "^2.0.0",
|
||||
"set-blocking": "^2.0.0",
|
||||
"string-width": "^4.2.0",
|
||||
"which-module": "^2.0.0",
|
||||
"y18n": "^4.0.0",
|
||||
"yargs-parser": "^18.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/yargs-parser": {
|
||||
"version": "18.1.3",
|
||||
"resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-18.1.3.tgz",
|
||||
"integrity": "sha512-o50j0JeToy/4K6OZcaQmW6lyXXKhq7csREXcDwk2omFPJEwUNOVtJKvmDr9EI1fAJZUyZcRF7kxGBWmRXudrCQ==",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"camelcase": "^5.0.0",
|
||||
"decamelize": "^1.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=6"
|
||||
}
|
||||
},
|
||||
"node_modules/yargs-parser/node_modules/camelcase": {
|
||||
"version": "5.3.1",
|
||||
"resolved": "https://registry.npmjs.org/camelcase/-/camelcase-5.3.1.tgz",
|
||||
"integrity": "sha512-L28STB170nwWS63UjtlEOE3dldQApaJXZkOI1uMFfzf3rRuPegHaHesyee+YxQ+W6SvRDQV6UrdOdRiR153wJg==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=6"
|
||||
}
|
||||
},
|
||||
"node_modules/yargs/node_modules/find-up": {
|
||||
"version": "4.1.0",
|
||||
"resolved": "https://registry.npmjs.org/find-up/-/find-up-4.1.0.tgz",
|
||||
"integrity": "sha512-PpOwAdQ/YlXQ2vj8a3h8IipDuYRi3wceVQQGYWxNINccq40Anw7BlsEXCMbt1Zt+OLA6Fq9suIpIWD0OsnISlw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"locate-path": "^5.0.0",
|
||||
"path-exists": "^4.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/yargs/node_modules/locate-path": {
|
||||
"version": "5.0.0",
|
||||
"resolved": "https://registry.npmjs.org/locate-path/-/locate-path-5.0.0.tgz",
|
||||
"integrity": "sha512-t7hw9pI+WvuwNJXwk5zVHpyhIqzg2qTlklJOf0mVxGSbe3Fp2VieZcduNYjaLDoy6p9uGpQEGWG87WpMKlNq8g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"p-locate": "^4.1.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/yargs/node_modules/p-limit": {
|
||||
"version": "2.3.0",
|
||||
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-2.3.0.tgz",
|
||||
"integrity": "sha512-//88mFWSJx8lxCzwdAABTJL2MyWB12+eIY7MDL2SqLmAkeKU9qxRvWuSyTjm3FUmpBEMuFfckAIqEaVGUDxb6w==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"p-try": "^2.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=6"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/sindresorhus"
|
||||
}
|
||||
},
|
||||
"node_modules/yargs/node_modules/p-locate": {
|
||||
"version": "4.1.0",
|
||||
"resolved": "https://registry.npmjs.org/p-locate/-/p-locate-4.1.0.tgz",
|
||||
"integrity": "sha512-R79ZZ/0wAxKGu3oYMlz8jy/kbhsNrS7SKZ7PxEHBgJ5+F2mtFW2fK2cOtBh1cHYkQsbzFV7I+EoRKe6Yt0oK7A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"p-limit": "^2.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/yocto-queue": {
|
||||
"version": "0.1.0",
|
||||
"resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz",
|
||||
|
||||
@@ -12,7 +12,6 @@
"dependencies": {
"esptool-js": "^0.5.7",
"leaflet": "^1.9.4",
"qrcode": "^1.5.4",
"react": "^19.2.0",
"react-dom": "^19.2.0",
"react-leaflet": "^5.0.0",
@@ -29,7 +28,6 @@
"eslint-plugin-react-refresh": "^0.4.24",
"globals": "^16.5.0",
"tailwindcss": "^4.1.18",
"vite": "^7.3.1",
"vite-plugin-svgr": "^4.5.0"
"vite": "^7.3.1"
}
}
Before Width: | Height: | Size: 25 KiB
Before Width: | Height: | Size: 16 KiB
Before Width: | Height: | Size: 14 KiB
@@ -1,9 +1,5 @@
import { Routes, Route, Navigate } from "react-router-dom";
import { useAuth } from "./auth/AuthContext";
import CloudFlashPage from "./cloudflash/CloudFlashPage";
import SerialMonitorPage from "./serial/SerialMonitorPage";
import SerialLogViewer from "./serial/SerialLogViewer";
import PublicFeaturesSettings from "./settings/PublicFeaturesSettings";
import LoginPage from "./auth/LoginPage";
import MainLayout from "./layout/MainLayout";
import MelodyList from "./melodies/MelodyList";
@@ -37,8 +33,8 @@ import DashboardPage from "./dashboard/DashboardPage";
import ApiReferencePage from "./developer/ApiReferencePage";
import { ProductList, ProductForm } from "./crm/products";
import { CustomerList, CustomerForm, CustomerDetail } from "./crm/customers";
import { OrderList } from "./crm/orders";
import { QuotationForm, AllQuotationsList } from "./crm/quotations";
import { OrderList, OrderForm, OrderDetail } from "./crm/orders";
import { QuotationForm } from "./crm/quotations";
import CommsPage from "./crm/inbox/CommsPage";
import MailPage from "./crm/mail/MailPage";

@@ -110,10 +106,6 @@ function RoleGate({ roles, children }) {
export default function App() {
return (
<Routes>
{/* Public routes — no login required */}
<Route path="/cloudflash" element={<CloudFlashPage />} />
<Route path="/serial-monitor" element={<SerialMonitorPage />} />

<Route path="/login" element={<LoginPage />} />
<Route
element={
@@ -179,7 +171,9 @@ export default function App() {
<Route path="crm/customers/:id" element={<PermissionGate section="crm"><CustomerDetail /></PermissionGate>} />
<Route path="crm/customers/:id/edit" element={<PermissionGate section="crm" action="edit"><CustomerForm /></PermissionGate>} />
<Route path="crm/orders" element={<PermissionGate section="crm"><OrderList /></PermissionGate>} />
<Route path="crm/quotations" element={<PermissionGate section="crm"><AllQuotationsList /></PermissionGate>} />
<Route path="crm/orders/new" element={<PermissionGate section="crm" action="edit"><OrderForm /></PermissionGate>} />
<Route path="crm/orders/:id" element={<PermissionGate section="crm"><OrderDetail /></PermissionGate>} />
<Route path="crm/orders/:id/edit" element={<PermissionGate section="crm" action="edit"><OrderForm /></PermissionGate>} />
<Route path="crm/quotations/new" element={<PermissionGate section="crm" action="edit"><QuotationForm /></PermissionGate>} />
<Route path="crm/quotations/:id" element={<PermissionGate section="crm" action="edit"><QuotationForm /></PermissionGate>} />

@@ -193,12 +187,6 @@ export default function App() {
<Route path="settings/staff/:id" element={<RoleGate roles={["sysadmin", "admin"]}><StaffDetail /></RoleGate>} />
<Route path="settings/staff/:id/edit" element={<RoleGate roles={["sysadmin", "admin"]}><StaffForm /></RoleGate>} />

{/* Settings - Public Features */}
<Route path="settings/public-features" element={<RoleGate roles={["sysadmin", "admin"]}><PublicFeaturesSettings /></RoleGate>} />

{/* Settings - Serial Log Viewer */}
<Route path="settings/serial-logs" element={<RoleGate roles={["sysadmin", "admin"]}><SerialLogViewer /></RoleGate>} />

<Route path="*" element={<Navigate to="/" replace />} />
</Route>
</Routes>
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.73 44.73">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m4.09,8.19v32.45h32.45v4.09H4.09c-1.1,0-2.06-.41-2.87-1.22-.81-.81-1.22-1.77-1.22-2.87V8.19h4.09ZM40.64,0c1.1,0,2.06.41,2.87,1.22.81.81,1.22,1.77,1.22,2.87v28.46c0,1.1-.41,2.05-1.22,2.83-.81.78-1.77,1.17-2.87,1.17H12.18c-1.1,0-2.05-.39-2.83-1.17-.78-.78-1.17-1.72-1.17-2.83V4.09c0-1.1.39-2.06,1.17-2.87.78-.81,1.72-1.22,2.83-1.22h28.46Zm0,32.55V4.09H12.18v28.46h28.46Zm-8.09-8.19c0,1.14-.41,2.1-1.22,2.9-.81.8-1.77,1.19-2.87,1.19h-8.09v-4.09h8.09v-4h-4.09v-4.09h4.09v-4.09h-8.09v-4h8.09c1.1,0,2.06.38,2.87,1.15.81.76,1.22,1.71,1.22,2.85v3.02c0,.88-.29,1.61-.88,2.19-.58.58-1.3.88-2.14.88.84,0,1.56.29,2.14.88.58.58.88,1.3.88,2.14v3.07Z"/>
</g>
</svg>
Before Width: | Height: | Size: 845 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.73 44.73">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m4.09,8.19v32.45h32.45v4.09H4.09c-1.1,0-2.06-.41-2.87-1.22-.81-.81-1.22-1.77-1.22-2.87V8.19h4.09ZM40.64,0c1.1,0,2.06.41,2.87,1.22.81.81,1.22,1.77,1.22,2.87v28.46c0,1.1-.41,2.05-1.22,2.83-.81.78-1.77,1.17-2.87,1.17H12.18c-1.1,0-2.05-.39-2.83-1.17-.78-.78-1.17-1.72-1.17-2.83V4.09c0-1.1.39-2.06,1.17-2.87.78-.81,1.72-1.22,2.83-1.22h28.46Zm0,32.55V4.09H12.18v28.46h28.46Zm-16.27-4.09c-1.1,0-2.05-.4-2.83-1.19-.78-.8-1.17-1.76-1.17-2.9v-12.18c0-1.14.39-2.09,1.17-2.85.78-.76,1.72-1.15,2.83-1.15h8.19v4h-8.19v4.09h4.09c1.1,0,2.06.4,2.87,1.19.81.8,1.22,1.76,1.22,2.9v4c0,1.14-.41,2.1-1.22,2.9-.81.8-1.77,1.19-2.87,1.19h-4.09Zm0-8.09v4h4.09v-4h-4.09Z"/>
</g>
</svg>
Before Width: | Height: | Size: 853 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.68 44.68">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m4.09,8.18v32.41h32.41v4.09H4.09c-1.1,0-2.06-.41-2.87-1.22-.81-.81-1.22-1.77-1.22-2.87V8.18h4.09ZM40.59,0c1.1,0,2.06.41,2.87,1.22.81.81,1.22,1.77,1.22,2.87v28.42c0,1.1-.41,2.04-1.22,2.82-.81.78-1.77,1.17-2.87,1.17H12.17c-1.1,0-2.04-.39-2.82-1.17s-1.17-1.72-1.17-2.82V4.09c0-1.1.39-2.06,1.17-2.87.78-.81,1.72-1.22,2.82-1.22h28.42Zm0,32.51V4.09H12.17v28.42h28.42Zm-12.17-24.33c1.1,0,2.06.38,2.87,1.14.81.76,1.22,1.71,1.22,2.85v12.17c0,1.14-.41,2.1-1.22,2.9-.81.8-1.77,1.19-2.87,1.19h-8.08v-4.09h8.08v-3.99h-4.09c-1.1,0-2.04-.4-2.82-1.19-.78-.79-1.17-1.76-1.17-2.9v-4.09c0-1.14.39-2.08,1.17-2.85.78-.76,1.72-1.14,2.82-1.14h4.09Zm0,8.08v-4.09h-4.09v4.09h4.09Z"/>
</g>
</svg>
Before Width: | Height: | Size: 865 B
@@ -1,9 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.14 44.14">
<g id="Layer_1-2" data-name="Layer 1">
<g>
<path d="m24.72,10.46v13.65c0,.25-.09.47-.27.66-.18.18-.42.27-.71.27h-9.72c-.29,0-.53-.09-.71-.27-.18-.18-.27-.4-.27-.66v-1.97c0-.29.09-.53.27-.71.18-.18.42-.27.71-.27h6.83v-10.71c0-.29.09-.53.27-.71s.4-.27.66-.27h1.97c.29,0,.53.09.71.27.18.18.27.42.27.71Z"/>
<path d="m0,22.07c0-6.09,2.16-11.3,6.49-15.62C10.81,2.12,16.01-.03,22.07,0c6.06.03,11.27,2.18,15.62,6.44,4.35,4.27,6.5,9.48,6.44,15.62-.06,6.15-2.21,11.36-6.44,15.62-4.24,4.27-9.45,6.41-15.62,6.44-6.18.03-11.37-2.12-15.58-6.44C2.28,33.37.12,28.16,0,22.07Zm4.81,0c0,4.77,1.69,8.83,5.08,12.18,3.38,3.35,7.44,5.05,12.18,5.08,4.74.03,8.8-1.66,12.18-5.08,3.38-3.41,5.08-7.47,5.08-12.18s-1.69-8.77-5.08-12.18c-3.38-3.41-7.44-5.1-12.18-5.08-4.74.03-8.8,1.72-12.18,5.08-3.38,3.35-5.08,7.42-5.08,12.18Z"/>
</g>
</g>
</svg>
Before Width: | Height: | Size: 979 B
@@ -1,9 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.14 44.14">
<g id="Layer_1-2" data-name="Layer 1">
<g>
<path d="m0,22.07c0-6.09,2.16-11.3,6.49-15.62C10.81,2.12,16.01-.03,22.07,0c6.06.03,11.27,2.18,15.62,6.44,4.35,4.27,6.5,9.48,6.44,15.62-.06,6.15-2.21,11.36-6.44,15.62s-9.45,6.41-15.62,6.44c-6.18.03-11.37-2.12-15.58-6.44C2.28,33.37.12,28.16,0,22.07Zm4.81,0c0,4.77,1.69,8.83,5.08,12.18,3.38,3.35,7.44,5.05,12.18,5.08,4.74.03,8.8-1.66,12.18-5.08,3.38-3.41,5.08-7.47,5.08-12.18s-1.69-8.77-5.08-12.18c-3.38-3.41-7.44-5.1-12.18-5.08-4.74.03-8.8,1.72-12.18,5.08-3.38,3.35-5.08,7.42-5.08,12.18Z"/>
<path d="m23.69,30.21c1.22-.03,2.25-.5,3.1-1.38.85-.89,1.28-1.96,1.28-3.21,0-1.01-.3-1.93-.91-2.77-.61-.83-1.38-1.39-2.32-1.67l-4.59-1.36c-.28-.07-.5-.23-.65-.47-.16-.24-.23-.54-.23-.89s.11-.65.34-.91c.23-.26.51-.39.86-.39h2.87c.42,0,.85.14,1.3.42.14.1.3.15.5.13.19-.02.37-.1.55-.23l1.2-1.15c.17-.14.25-.34.23-.6-.02-.26-.11-.46-.29-.6-.9-.7-1.98-1.08-3.23-1.15v-2.5c0-.24-.08-.44-.23-.6s-.34-.23-.55-.23h-1.67c-.21,0-.39.08-.55.23s-.23.36-.23.6v2.45c-1.22.04-2.25.5-3.1,1.38-.85.89-1.28,1.96-1.28,3.21,0,1.01.3,1.93.91,2.77.61.83,1.38,1.39,2.32,1.67l4.59,1.36c.24.07.45.23.63.5.17.26.26.53.26.81,0,.38-.11.7-.34.97-.23.26-.51.39-.86.39h-2.87c-.38,0-.82-.12-1.3-.37-.17-.1-.36-.16-.55-.16s-.36.07-.5.21l-1.2,1.15c-.17.17-.25.39-.23.65.02.26.13.46.34.6.94.7,2,1.08,3.18,1.15v2.45c0,.24.08.44.23.6.16.16.34.23.55.23h1.67c.21,0,.39-.08.55-.23.16-.16.23-.36.23-.6v-2.45Z"/>
</g>
</g>
</svg>
Before Width: | Height: | Size: 1.5 KiB
@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 48.38 43">
<g id="Layer_1-2" data-name="Layer 1">
<g>
<path d="m32.25,10.23c0-.54-.2-1.02-.6-1.42L23.43.6c-.4-.4-.87-.6-1.42-.6h-.52v10.75h10.75v-.52Z"/>
<path d="m19.4,29.15c-.26-.26-.39-.57-.39-.95v-2.67c0-.37.13-.69.39-.97.26-.27.57-.41.95-.41h11.9v-10.75h-11.44c-.54,0-1.01-.19-1.4-.58-.39-.39-.58-.85-.58-1.4V0H2.02c-.54,0-1.02.19-1.42.58-.4.39-.6.87-.6,1.44v38.96c0,.54.2,1.02.6,1.42s.87.6,1.42.6h28.21c.57,0,1.05-.2,1.44-.6.39-.4.58-.87.58-1.42v-11.44h-11.9c-.37,0-.69-.13-.95-.39Zm-5.34-2.56c-.7.73-1.56,1.11-2.56,1.14v2.02c0,.2-.06.37-.19.49-.13.13-.28.19-.45.19h-1.38c-.17,0-.32-.06-.45-.19-.13-.13-.19-.29-.19-.49v-2.02c-.97-.06-1.85-.37-2.62-.95-.17-.11-.27-.28-.28-.49-.01-.22.05-.39.19-.54l.99-.95c.11-.11.25-.17.41-.17s.31.04.45.13c.4.2.76.3,1.07.3h2.37c.29,0,.52-.11.71-.32.19-.22.28-.48.28-.8,0-.23-.07-.45-.21-.67-.14-.22-.32-.35-.52-.41l-3.78-1.12c-.77-.23-1.41-.69-1.91-1.38-.5-.69-.75-1.45-.75-2.28,0-1.03.35-1.91,1.05-2.64.7-.73,1.55-1.11,2.56-1.14v-2.02c0-.2.06-.37.19-.49.13-.13.28-.19.45-.19h1.38c.17,0,.32.06.45.19.13.13.19.29.19.49v2.06c1.03.06,1.92.37,2.67.95.14.11.22.28.24.49.01.22-.05.38-.19.49l-.99.95c-.14.11-.29.18-.45.19-.16.01-.29-.02-.41-.11-.37-.23-.73-.34-1.07-.34h-2.37c-.29,0-.52.11-.71.32-.19.21-.28.47-.28.75s.06.53.19.73c.13.2.31.33.54.39l3.78,1.12c.77.23,1.41.69,1.91,1.38.5.69.75,1.45.75,2.28,0,1.03-.35,1.91-1.05,2.64Z"/>
<path d="m47.95,25.89l-8.04-8.13c-.26-.26-.57-.39-.95-.39s-.69.13-.95.39c-.26.26-.39.57-.39.95v5.46h-5.37v5.37h5.37v5.5c0,.4.14.72.41.97.27.24.59.37.95.39.36.01.67-.12.92-.41l8.04-8.13c.29-.29.43-.62.43-1.01s-.14-.71-.43-.97Z"/>
</g>
</g>
</svg>
Before Width: | Height: | Size: 1.7 KiB
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 43.63 43.63">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m2.52,0l41.1,41.15-2.52,2.48-11.34-11.34H11.91l-7.91,7.91V6.53L0,2.52,2.52,0Zm37.2.57c1.08,0,2,.38,2.76,1.14s1.14,1.68,1.14,2.76v23.81c0,1.08-.37,2.01-1.1,2.79-.73.78-1.64,1.18-2.71,1.21l-13.86-13.86h9.76v-4h-13.76l-1.95-1.95h15.72v-4h-15.91v3.81L8.1.57h31.63ZM11.91,18.43h4l-4-4v4Zm4,5.95v-4h-4v4h4Z"/>
</g>
</svg>
Before Width: | Height: | Size: 510 B
@@ -1,5 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8 7C9.65685 7 11 5.65685 11 4C11 2.34315 9.65685 1 8 1C6.34315 1 5 2.34315 5 4C5 5.65685 6.34315 7 8 7Z" fill="currentColor"/>
<path d="M14 12C14 10.3431 12.6569 9 11 9H5C3.34315 9 2 10.3431 2 12V15H14V12Z" fill="currentColor"/>
</svg>
Before Width: | Height: | Size: 466 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.14 44.14">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m0,22.07c0-6.09,2.16-11.3,6.49-15.62C10.81,2.12,16.01-.03,22.07,0c6.06.03,11.27,2.18,15.62,6.44,4.35,4.27,6.5,9.48,6.44,15.62-.06,6.15-2.21,11.36-6.44,15.62-4.24,4.27-9.45,6.41-15.62,6.44-6.18.03-11.37-2.12-15.58-6.44C2.28,33.37.12,28.16,0,22.07Zm4.81,0c0,4.77,1.69,8.83,5.08,12.18,3.38,3.35,7.44,5.05,12.18,5.08,4.74.03,8.8-1.66,12.18-5.08,3.38-3.41,5.08-7.47,5.08-12.18s-1.69-8.77-5.08-12.18c-3.38-3.41-7.44-5.1-12.18-5.08-4.74.03-8.8,1.72-12.18,5.08-3.38,3.35-5.08,7.42-5.08,12.18Zm6.4,6.05l6.05-6.05-6.05-6.05,4.81-4.81,6.05,6.05,6.05-6.05,4.81,4.81-6.05,6.05,6.05,6.05-4.81,4.81-6.05-6.05-6.05,6.05-4.81-4.81Z"/>
</g>
</svg>
Before Width: | Height: | Size: 824 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44 39.16">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m43.51,33.64c.44.81.59,1.64.43,2.51-.16.86-.57,1.58-1.23,2.15-.67.57-1.46.86-2.37.86H3.66c-.91,0-1.7-.29-2.37-.86-.67-.57-1.08-1.29-1.23-2.15-.16-.86-.01-1.7.43-2.51L18.81,1.86c.44-.81,1.09-1.36,1.94-1.64.85-.29,1.69-.29,2.52,0,.83.29,1.47.84,1.92,1.64l18.32,31.78Zm-21.49-6.58c-.97,0-1.79.35-2.49,1.04-.69.69-1.04,1.52-1.04,2.49s.35,1.79,1.04,2.49c.69.69,1.52,1.04,2.49,1.04s1.79-.34,2.47-1.02c.68-.68,1.02-1.51,1.02-2.49s-.34-1.81-1.02-2.51c-.68-.69-1.5-1.04-2.47-1.04Zm-3.37-12.64l.59,10.41c0,.23.09.44.27.61.18.17.39.25.63.25h3.72c.23,0,.44-.08.63-.25.18-.17.27-.37.27-.61l.59-10.41c.03-.26-.05-.48-.23-.67-.18-.18-.42-.27-.7-.27h-4.81c-.26,0-.48.09-.67.27-.18.18-.27.4-.27.67Z"/>
</g>
</svg>
Before Width: | Height: | Size: 888 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.14 44.14">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m0,22.07c0-6.09,2.16-11.3,6.49-15.62C10.81,2.12,16.01-.03,22.07,0c6.06.03,11.27,2.18,15.62,6.44,4.35,4.27,6.5,9.48,6.44,15.62-.06,6.15-2.21,11.36-6.44,15.62s-9.45,6.41-15.62,6.44c-6.18.03-11.37-2.12-15.58-6.44C2.28,33.37.12,28.16,0,22.07Zm4.81,0c0,4.77,1.69,8.83,5.08,12.18,3.38,3.35,7.44,5.05,12.18,5.08,4.74.03,8.8-1.66,12.18-5.08,3.38-3.41,5.08-7.47,5.08-12.18s-1.69-8.77-5.08-12.18c-3.38-3.41-7.44-5.1-12.18-5.08-4.74.03-8.8,1.72-12.18,5.08-3.38,3.35-5.08,7.42-5.08,12.18Zm4.68,2.25l4.15-4.06,4.1,4.15,12.76-12.84,4.15,4.1-12.84,12.76-4.06,4.15-4.1-4.15-4.15-4.1Z"/>
</g>
</svg>
Before Width: | Height: | Size: 777 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 41.48 41.48">
<g id="Layer_1-2" data-name="Layer 1">
<path d="m31.16,20.74c0,.57-.21,1.05-.62,1.45-.42.4-.92.6-1.52.6H8.28L0,31.16V2.04C0,1.48.2,1,.6.6c.4-.4.88-.6,1.45-.6h26.97c.6,0,1.1.2,1.52.6.42.4.62.88.62,1.45v18.7Zm8.28-12.46c.56,0,1.05.2,1.45.6.4.4.6.88.6,1.45v31.16l-8.28-8.28H10.32c-.57,0-1.05-.2-1.45-.6s-.6-.88-.6-1.45v-4.19h26.97V8.28h4.19Z"/>
</g>
</svg>
Before Width: | Height: | Size: 500 B
@@ -1,4 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M5.58579 4.58579C5 5.17157 5 6.11438 5 8V17C5 18.8856 5 19.8284 5.58579 20.4142C6.17157 21 7.11438 21 9 21H15C16.8856 21 17.8284 21 18.4142 20.4142C19 19.8284 19 18.8856 19 17V8C19 6.11438 19 5.17157 18.4142 4.58579C17.8284 4 16.8856 4 15 4H9C7.11438 4 6.17157 4 5.58579 4.58579ZM9 8C8.44772 8 8 8.44772 8 9C8 9.55228 8.44772 10 9 10H15C15.5523 10 16 9.55228 16 9C16 8.44772 15.5523 8 15 8H9ZM9 12C8.44772 12 8 12.4477 8 13C8 13.5523 8.44772 14 9 14H15C15.5523 14 16 13.5523 16 13C16 12.4477 15.5523 12 15 12H9ZM9 16C8.44772 16 8 16.4477 8 17C8 17.5523 8.44772 18 9 18H13C13.5523 18 14 17.5523 14 17C14 16.4477 13.5523 16 13 16H9Z" fill="currentColor"/>
</svg>
Before Width: | Height: | Size: 930 B
@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 53.85 41.66">
  <g id="Layer_1-2" data-name="Layer 1">
    <g>
      <path d="m39.43,33.73l.04-.51.72.55c1.2.88,2.82.73,3.84-.36.91-.97.88-2.36-.08-3.54-4.03-4.91-8.06-9.82-12.09-14.72l-3.61-4.4q-.5-.6-1.18-.22l-1.56.87c-1.59.89-3.17,1.78-4.77,2.65-2.53,1.37-5.1,1.54-7.65.51-1.77-.72-2.79-2.07-2.94-3.89-.08-.95.1-1.87.54-2.84.13-.28.15-.5.07-.64-.08-.14-.3-.23-.6-.25-.12,0-.23,0-.35,0H1.75C.36,6.94,0,7.31,0,8.71,0,14.45,0,20.19,0,25.93c0,.21,0,.46.04.68.1.65.48.99,1.14,1.01.46.01.91.01,1.37,0,.36,0,.72,0,1.08,0,.2.02.32-.06.45-.22.31-.39.63-.77.94-1.16.22-.26.43-.52.64-.78,1.1-1.35,2.65-2.11,4.26-2.11.24,0,.48.02.72.05,1.85.26,3.42,1.52,4.2,3.35.11.27.24.39.47.48,1.48.53,2.49,1.55,3.03,3.01.1.28.25.42.55.54,1.01.4,1.78,1.03,2.31,1.87.12.18.26.28.54.36,1.67.45,2.86,1.44,3.52,2.95.72,1.64.64,3.29-.25,4.9-.16.28-.18.38-.18.42.02.03.16.1.41.19.72.27,1.52.22,2.21-.14.68-.36,1.17-.99,1.34-1.73.05-.23.08-.46.12-.7.02-.12.03-.24.05-.37l.08-.51.38.35c.87.79,2.07,1.04,3.07.63,1.02-.42,1.64-1.45,1.7-2.81l.02-.56.43.36c1.05.88,2.09,1.07,3.18.58,1.01-.45,1.5-1.3,1.61-2.84Z"/>
      <path d="m17.3,6.35c-1.11.85-2.22,1.7-3.34,2.54-.6.45-.82.96-.72,1.61.11.65.48,1.06,1.19,1.32,1.75.62,3.4.46,5.06-.5,2.23-1.29,4.52-2.57,6.72-3.81l.74-.42c.44-.24.82-.37,1.18-.37.54,0,1.02.29,1.51.88l17.11,20.77c.07.08.28.34.38.35.06,0,.16-.04.4-.26.7-.64,1.51-.93,2.43-.87.87.05,1.74.03,2.59,0,.73-.02,1.13-.37,1.24-1.1.04-.27.04-.55.04-.79,0-2.26,0-4.52,0-6.77v-4.43c0-2.02,0-4.05,0-6.07,0-1.1-.38-1.47-1.5-1.48h-.13c-.66,0-1.32-.01-1.98,0-1.08,0-2.11-.16-3.1-.54l-2.48-.95c-4.16-1.6-8.33-3.2-12.5-4.77-3.24-1.22-6.29-.79-9.06,1.28-1.94,1.45-3.87,2.92-5.8,4.4Z"/>
      <path d="m12.13,28.29c-.13-.93-.63-1.55-1.46-1.83-.93-.31-1.75-.08-2.43.69-.51.57-1,1.19-1.47,1.78l-.46.58c-.7.87-.78,1.95-.19,2.82.68,1,1.76,1.4,2.84,1.03.21-.07.44-.13.62,0,.21.14.21.41.22.6.02.79.35,1.47.92,1.92.58.46,1.34.61,2.13.44.21-.05.43-.07.62.05.18.13.24.35.27.57.1.76.48,1.39,1.04,1.78.55.38,1.24.49,1.95.32.13-.03.24-.05.35-.05.3,0,.53.15.69.6.42,1.16,1.68,1.82,2.85,1.51,1.06-.29,2.07-1.74,2.07-3,0-.72-.28-1.34-.8-1.77-.52-.43-1.19-.58-1.89-.45-.55.11-.83-.07-.96-.61-.31-1.38-1.31-2.04-2.74-1.82-.21.03-.48.05-.64-.14-.19-.21-.1-.51-.07-.64.09-.33.1-.68.02-1.03-.13-.62-.48-1.13-.97-1.44-.54-.33-1.22-.41-1.92-.23-.02,0-.04.02-.07.03-.11.05-.38.17-.61-.06-.24-.24-.11-.52-.05-.62.12-.41.2-.72.15-1.04Z"/>
    </g>
  </g>
</svg>

Before Width: | Height: | Size: 2.5 KiB
@@ -1,25 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>

<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg height="800px" width="800px" version="1.1" id="_x32_" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
	viewBox="0 0 512 512" xml:space="preserve">
<style type="text/css">
	.st0{fill:currentColor;}
</style>
<g>
	<path class="st0" d="M116.713,337.355c-20.655,0-37.456,16.801-37.456,37.456c0,20.655,16.802,37.455,37.456,37.455
		c20.649,0,37.448-16.8,37.448-37.455C154.161,354.156,137.362,337.355,116.713,337.355z"/>
	<path class="st0" d="M403.81,337.355c-20.649,0-37.449,16.801-37.449,37.456c0,20.655,16.8,37.455,37.449,37.455
		c20.649,0,37.45-16.8,37.45-37.455C441.261,354.156,424.459,337.355,403.81,337.355z"/>
	<path class="st0" d="M497.571,99.735H252.065c-7.974,0-14.429,6.466-14.429,14.44v133.818c0,7.972,6.455,14.428,14.429,14.428
		h245.506c7.966,0,14.429-6.456,14.429-14.428V114.174C512,106.201,505.538,99.735,497.571,99.735z"/>
	<path class="st0" d="M499.966,279.409H224.225c-6.64,0-12.079-5.439-12.079-12.079V111.739c0-6.638-5.359-11.999-11.999-11.999
		H90.554c-3.599,0-6.96,1.602-9.281,4.32L2.801,198.213C1.039,200.373,0,203.094,0,205.893v125.831
		c0,6.64,5.439,11.999,12.079,11.999h57.516c10.08-15.358,27.438-25.438,47.118-25.438c19.678,0,37.036,10.08,47.116,25.438h192.868
		c10.079-15.358,27.438-25.438,47.116-25.438c19.678,0,37.039,10.08,47.118,25.438h49.036c6.64,0,11.999-5.359,11.999-11.999
		v-40.316C511.965,284.768,506.606,279.409,499.966,279.409z M43.997,215.493v-8.32c0-2.881,0.961-5.601,2.72-7.84l50.157-61.675
		c2.318-2.881,5.839-4.56,9.599-4.56h49.116c6.8,0,12.4,5.519,12.4,12.4v69.995c0,6.798-5.599,12.398-12.4,12.398H56.396
		C49.516,227.891,43.997,222.292,43.997,215.493z"/>
</g>
</svg>

Before Width: | Height: | Size: 1.8 KiB
@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 41.05 41.09">
  <g id="Layer_1-2" data-name="Layer 1">
    <g>
      <path d="m0,41.09h41.05V15.4H0v25.69Zm13.42-14.41l3.62,3.66,11.24-11.32,3.66,3.62-11.32,11.24-3.58,3.66-3.62-3.66-3.66-3.62,3.66-3.58Z"/>
      <polygon points="35.9 0 23.12 0 23.12 10.25 41.05 10.25 35.9 0"/>
      <polygon points="17.97 0 5.15 0 0 10.25 17.97 10.25 17.97 0"/>
    </g>
  </g>
</svg>

Before Width: | Height: | Size: 495 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg id="Layer_2" data-name="Layer 2" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 44.14 44.14">
  <g id="Layer_1-2" data-name="Layer 1">
    <path d="m0,22.07c0-6.09,2.16-11.3,6.49-15.62C10.81,2.12,16.01-.03,22.07,0c6.06.03,11.27,2.18,15.62,6.44,4.35,4.27,6.5,9.48,6.44,15.62-.06,6.15-2.21,11.36-6.44,15.62-4.24,4.27-9.45,6.41-15.62,6.44-6.18.03-11.37-2.12-15.58-6.44C2.28,33.37.12,28.16,0,22.07Zm8.78,8.87l4.41,4.41,12.49-12.54c1.21.47,2.52.56,3.93.26,1.41-.29,2.62-.94,3.62-1.94,1-1,1.65-2.19,1.94-3.58.29-1.38.21-2.69-.26-3.93l-4.41,4.37-3.18-1.19-1.19-3.18,4.37-4.41c-.74-.29-1.53-.44-2.38-.44-2.03,0-3.72.72-5.08,2.16-1.06.97-1.72,2.16-1.99,3.58-.26,1.41-.19,2.72.22,3.93l-12.49,12.49Z"/>
  </g>
</svg>

Before Width: | Height: | Size: 750 B
@@ -1,2 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg fill="#000000" width="800px" height="800px" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg"><path d="M5.755,20.283,4,8H20L18.245,20.283A2,2,0,0,1,16.265,22H7.735A2,2,0,0,1,5.755,20.283ZM21,4H16V3a1,1,0,0,0-1-1H9A1,1,0,0,0,8,3V4H3A1,1,0,0,0,3,6H21a1,1,0,0,0,0-2Z"/></svg>

Before Width: | Height: | Size: 401 B
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M12 7L12 14M12 14L15 11M12 14L9 11" stroke="#1C274C" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M16 17H12H8" stroke="#1C274C" stroke-width="1.5" stroke-linecap="round"/>
<path d="M22 12C22 16.714 22 19.0711 20.5355 20.5355C19.0711 22 16.714 22 12 22C7.28595 22 4.92893 22 3.46447 20.5355C2 19.0711 2 16.714 2 12C2 7.28595 2 4.92893 3.46447 3.46447C4.92893 2 7.28595 2 12 2C16.714 2 19.0711 2 20.5355 3.46447C21.5093 4.43821 21.8356 5.80655 21.9449 8" stroke="#1C274C" stroke-width="1.5" stroke-linecap="round"/>
</svg>

Before Width: | Height: | Size: 782 B
@@ -1,24 +0,0 @@
<?xml version="1.0" ?>

<!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg">

<title/>

<g id="Complete">

<g id="edit">

<g>

<path d="M20,16v4a2,2,0,0,1-2,2H4a2,2,0,0,1-2-2V6A2,2,0,0,1,4,4H8" fill="none" stroke="#000000" stroke-linecap="round" stroke-linejoin="round" stroke-width="2"/>

<polygon fill="none" points="12.5 15.8 22 6.2 17.8 2 8.3 11.5 8 16 12.5 15.8" stroke="#000000" stroke-linecap="round" stroke-linejoin="round" stroke-width="2"/>

</g>

</g>

</g>

</svg>

Before Width: | Height: | Size: 594 B
@@ -1,4 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M16 8L21 3M21 3H16M21 3V8M8 8L3 3M3 3L3 8M3 3L8 3M8 16L3 21M3 21H8M3 21L3 16M16 16L21 21M21 21V16M21 21H16" stroke="#000000" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>

Before Width: | Height: | Size: 427 B
@@ -1,4 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg fill="#000000" width="800px" height="800px" viewBox="0 0 32 32" xmlns="http://www.w3.org/2000/svg">
<path d="M16.027 8.713c-3.333 0-6.136 2.287-6.991 5.355-0.744-1.641-2.391-2.808-4.301-2.808-2.609 0.016-4.724 2.131-4.735 4.74 0.011 2.609 2.125 4.724 4.735 4.74 1.911 0 3.552-1.167 4.301-2.813 0.855 3.073 3.657 5.36 6.991 5.36 3.312 0 6.099-2.26 6.973-5.308 0.755 1.615 2.375 2.761 4.26 2.761 2.615-0.016 4.729-2.131 4.74-4.74-0.011-2.609-2.125-4.724-4.74-4.74-1.885 0-3.505 1.147-4.265 2.761-0.869-3.048-3.656-5.308-6.968-5.308zM16.027 11.495c2.5 0 4.5 2 4.5 4.505s-2 4.505-4.5 4.505c-2.496 0.011-4.516-2.016-4.505-4.505 0-2.505 2-4.505 4.505-4.505zM4.735 14.041c1.099 0 1.959 0.86 1.959 1.959s-0.86 1.959-1.959 1.959c-1.084 0.011-1.969-0.876-1.953-1.959 0-1.099 0.859-1.959 1.953-1.959zM27.26 14.041c1.1 0 1.959 0.86 1.959 1.959s-0.859 1.959-1.959 1.959c-1.083 0.011-1.963-0.876-1.953-1.959 0-1.099 0.86-1.959 1.953-1.959z"/>
</svg>

Before Width: | Height: | Size: 1.0 KiB