Add configurable upstream sync proxy and schedule settings

yuanzhen869
2026-03-19 18:05:22 +08:00
parent 1b420cd492
commit a64725d60c
8 changed files with 480 additions and 5 deletions

View File

@@ -1,8 +1,37 @@
# Compose 会按以下优先级取值:
# 1. 当前 shell 的环境变量
# 2. 项目根目录下的 .env
# 3. docker-compose.yml 里的默认值
# 远程 MySQL 地址
MYSQL_HOST=your.mysql.host
MYSQL_PORT=3306
# 容器时区
TZ=Asia/Shanghai
# 管理账号:用于 schema / seed 装载
MYSQL_ROOT_USER=root
MYSQL_ROOT_PASSWORD=mobilemodels_root_change_me
# 业务数据库名
MYSQL_DATABASE=mobilemodels
# 只读账号:用于页面 SQL 查询和第三方联调
MYSQL_READER_USER=mobilemodels_reader
MYSQL_READER_PASSWORD=mobilemodels_reader_change_me
# 是否在容器启动或原始数据同步后自动装载 MySQL
# 远程 MySQL 场景建议保持 0
# 本地测试 MySQL 场景可设置为 1
MYSQL_AUTO_LOAD=0
# 是否启用项目内的每日自动同步
SYNC_SCHEDULE_ENABLED=0
# 每日自动同步时间,格式 HH:MM
SYNC_SCHEDULE_TIME=03:00
# GitHub 加速前缀,留空表示直连
# 例如 https://ghfast.top/
GITHUB_PROXY_PREFIX=
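The boolean-style flags above (`MYSQL_AUTO_LOAD`, `SYNC_SCHEDULE_ENABLED`) are parsed permissively rather than as strict `0`/`1`; a minimal sketch of that convention (mirroring the `truthy_env` helper added later in this commit):

```python
def truthy(value: str) -> bool:
    """Interpret boolean-style env values such as MYSQL_AUTO_LOAD=0/1."""
    return value.strip().lower() in {"1", "true", "yes", "on"}

print(truthy("0"), truthy("1"), truthy("YES"))  # False True True
```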

View File

@@ -6,7 +6,7 @@ ENV PYTHONDONTWRITEBYTECODE=1 \
WORKDIR /app
RUN apt-get update \
&& apt-get install -y --no-install-recommends git ca-certificates default-mysql-client \
&& apt-get install -y --no-install-recommends git ca-certificates default-mysql-client tzdata \
&& rm -rf /var/lib/apt/lists/*
COPY . /app

View File

@@ -14,6 +14,12 @@ docker compose up --build -d
docker compose -f docker-compose.yml -f docker-compose.test.yml up --build -d
```
如需自定义 MySQL 连接,先复制环境模板:
```bash
cp .env.example .env
```
页面入口:
- `http://127.0.0.1:8123/web/device_query.html`
@@ -37,7 +43,10 @@ web/ 页面与静态资源
- `docker-compose.yml`、`Dockerfile`、`tools/` 都位于项目主目录
- 默认主配置面向远程 MySQL
- `docker-compose.test.yml` 中的 MySQL 仅用于本地测试
- Compose 会优先读取 shell 环境变量和项目根目录 `.env`,再回退到 `docker-compose.yml` 默认值
- 上游原始 git 同步、索引构建和 MySQL 刷新都在容器内完成
- 项目内置“每日自动同步”调度,不依赖 GitHub Actions,时间点可在数据管理页设置,也可用 `.env` 覆盖默认值
- 如需 GitHub 加速,可配置 `GITHUB_PROXY_PREFIX`,也可在数据管理页直接修改
更多说明见:

View File

@@ -14,6 +14,12 @@ If you want a local test MySQL together with the app:
docker compose -f docker-compose.yml -f docker-compose.test.yml up --build -d
```
If you need custom MySQL settings, start by copying the env template:
```bash
cp .env.example .env
```
Entry pages:
- `http://127.0.0.1:8123/web/device_query.html`
@@ -37,7 +43,10 @@ Notes:
- `docker-compose.yml`, `Dockerfile`, and `tools/` live in the project root
- the main compose file targets remote MySQL usage
- `docker-compose.test.yml` provides a local MySQL only for testing
- Compose reads shell env vars and project-root `.env` first, then falls back to defaults in `docker-compose.yml`
- upstream git sync, index rebuild, and MySQL refresh run inside containers
- the project includes its own daily sync scheduler; you can configure the time in the Data Management page or override it via `.env`
- GitHub acceleration by URL prefix is supported through `GITHUB_PROXY_PREFIX` or the Data Management page
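The proxy-prefix note above works by simple URL concatenation: the prefix is prepended to the full upstream URL. A sketch under the assumption that the upstream is the `KHwang9883/MobileModels` GitHub repository:

```python
# Assumed upstream URL; the real value lives in the server's DEFAULT_REPO_URL.
DEFAULT_REPO_URL = "https://github.com/KHwang9883/MobileModels.git"

def with_proxy(prefix: str) -> str:
    """Prepend an accelerator prefix (e.g. https://ghfast.top/) to the repo URL."""
    if not prefix:
        return DEFAULT_REPO_URL
    if "://" not in prefix:
        raise ValueError("prefix must include a scheme, e.g. https://ghfast.top/")
    if not prefix.endswith("/"):
        prefix += "/"
    return prefix + DEFAULT_REPO_URL

print(with_proxy("https://ghfast.top"))
# https://ghfast.top/https://github.com/KHwang9883/MobileModels.git
```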
More details:

View File

@@ -7,6 +7,7 @@ services:
working_dir: /app
environment:
MOBILEMODELS_DATA_ROOT: /data
TZ: ${TZ:-Asia/Shanghai}
MYSQL_HOST: ${MYSQL_HOST:-host.docker.internal}
MYSQL_PORT: ${MYSQL_PORT:-3306}
MYSQL_DATABASE: ${MYSQL_DATABASE:-mobilemodels}
@@ -15,6 +16,9 @@ services:
MYSQL_READER_USER: ${MYSQL_READER_USER:-mobilemodels_reader}
MYSQL_READER_PASSWORD: ${MYSQL_READER_PASSWORD:-mobilemodels_reader_change_me}
MYSQL_AUTO_LOAD: ${MYSQL_AUTO_LOAD:-0}
SYNC_SCHEDULE_ENABLED: ${SYNC_SCHEDULE_ENABLED:-0}
SYNC_SCHEDULE_TIME: ${SYNC_SCHEDULE_TIME:-03:00}
GITHUB_PROXY_PREFIX: ${GITHUB_PROXY_PREFIX:-}
command: ["sh", "tools/container_start.sh"]
ports:
- "8123:8123"

View File

@@ -20,6 +20,12 @@ docker compose -f docker-compose.yml -f docker-compose.test.yml up --build -d
cp .env.example .env
```
Compose 的环境变量来源顺序:
1. 当前 shell 环境变量
2. 项目根目录 `.env`
3. `docker-compose.yml` 中的默认值
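上面的三级取值顺序可以用一个极简的示意函数表达(仅作说明,Compose 自身的插值行为见官方文档):

```python
import os

def resolve(name: str, dotenv: dict[str, str], compose_default: str) -> str:
    # 1. current shell environment  2. project-root .env  3. compose default
    if name in os.environ:
        return os.environ[name]
    return dotenv.get(name, compose_default)

dotenv = {"MYSQL_PORT": "3307"}          # value from .env
os.environ["MYSQL_PORT"] = "3308"        # shell wins over .env and the default
print(resolve("MYSQL_PORT", dotenv, "3306"))  # 3308
```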
停止服务:
```bash
@@ -48,6 +54,7 @@ docker compose down -v
- 生成 `dist/device_index.json`
- 导出 MySQL seed 文件
- 如开启 `MYSQL_AUTO_LOAD=1`,则加载 MySQL schema 与 seed 数据
- 启动项目内置的每日自动同步调度器
- 启动 Web 页面与 API 服务
## MySQL 默认连接
@@ -59,6 +66,21 @@ docker compose down -v
如需自定义账号密码,请使用 `.env` 覆盖默认值。
常用变量:
- `MYSQL_HOST`
- `MYSQL_PORT`
- `TZ`
- `MYSQL_ROOT_USER`
- `MYSQL_ROOT_PASSWORD`
- `MYSQL_DATABASE`
- `MYSQL_READER_USER`
- `MYSQL_READER_PASSWORD`
- `MYSQL_AUTO_LOAD`
- `SYNC_SCHEDULE_ENABLED`
- `SYNC_SCHEDULE_TIME`
- `GITHUB_PROXY_PREFIX`
## MySQL 模式
- 主配置 `docker-compose.yml`
@@ -108,8 +130,18 @@ docker compose down -v
- 品牌同义词管理
- 数据来源优先级管理
- 原始数据同步
- 每日自动同步时间点设置
- 索引数据查看与重新加载
### 每日自动同步
- 调度器运行在项目容器内部,不依赖 GitHub Actions
- 页面入口:`数据管理 -> 原始数据同步`
- 可设置是否启用,以及每天执行的时间点
- 可选配置 GitHub 加速前缀,例如 `https://ghfast.top/`
- 运行期配置持久化在 `/data/state/sync_schedule.json`
- 时间按容器时区执行,默认值来自 `TZ`,默认 `Asia/Shanghai`
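上述“下次执行时间”的推算方式可以用一个简化示意说明(对应本次提交中 `compute_next_run_at` 的逻辑:取下一个严格晚于当前时刻的 HH:MM):

```python
from datetime import datetime, timedelta

def next_run(daily_time: str, now: datetime) -> datetime:
    """Next occurrence of HH:MM strictly after `now` (illustrative sketch)."""
    hour, minute = map(int, daily_time.split(":"))
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:          # today's slot already passed -> tomorrow
        candidate += timedelta(days=1)
    return candidate

print(next_run("03:00", datetime(2026, 3, 19, 18, 5)))  # 2026-03-20 03:00:00
```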
## 说明
- 原始数据、索引和 MySQL seed 运行时持久化在 Docker volume 中,不回写本地工作区

View File

@@ -9,7 +9,8 @@ import os
import re
import subprocess
import threading
from datetime import datetime
import time
from datetime import datetime, timedelta
from http import HTTPStatus
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer
from pathlib import Path
@@ -24,12 +25,183 @@ MYSQL_SEED_PATH = PROJECT_ROOT / "dist/mobilemodels_mysql_seed.sql"
MYSQL_LOADER = PROJECT_ROOT / "tools/load_mysql_seed.py"
DATA_ROOT = Path(os.environ.get("MOBILEMODELS_DATA_ROOT", "/data"))
SYNC_METADATA_PATH = DATA_ROOT / "state/sync_status.json"
SCHEDULE_CONFIG_PATH = DATA_ROOT / "state/sync_schedule.json"
SYNC_LOCK = threading.Lock()
SCHEDULE_LOCK = threading.Lock()
NORMALIZE_RE = re.compile(r"[^0-9a-z\u4e00-\u9fff]+")
SCHEDULE_TIME_RE = re.compile(r"^(?:[01]?\d|2[0-3]):[0-5]\d$")
SCHEDULER_POLL_SECONDS = 20
def truthy_env(name: str, default: str = "0") -> bool:
return os.environ.get(name, default).strip().lower() in {"1", "true", "yes", "on"}
def apply_timezone_from_env() -> None:
if not os.environ.get("TZ"):
return
try:
time.tzset()
except AttributeError:
return
def mysql_auto_load_enabled() -> bool:
return os.environ.get("MYSQL_AUTO_LOAD", "0").strip().lower() in {"1", "true", "yes", "on"}
return truthy_env("MYSQL_AUTO_LOAD", "0")
def local_now() -> datetime:
return datetime.now().astimezone()
def normalize_schedule_time(value: str | None, *, fallback: str = "03:00") -> str:
text = str(value or "").strip()
if not text:
text = fallback
if not SCHEDULE_TIME_RE.match(text):
if fallback and text != fallback:
return normalize_schedule_time(fallback, fallback="")
raise RuntimeError("每日同步时间格式必须为 HH:MM,例如 03:00。")
hour, minute = text.split(":", 1)
return f"{int(hour):02d}:{int(minute):02d}"
def normalize_github_proxy_prefix(value: str | None) -> str:
text = str(value or "").strip()
if not text:
return ""
if "://" not in text:
raise RuntimeError("GitHub 加速前缀必须包含协议,例如 https://ghfast.top/")
if not text.endswith("/"):
text = f"{text}/"
return text
def get_effective_repo_url(github_proxy_prefix: str | None = None) -> str:
prefix = normalize_github_proxy_prefix(
github_proxy_prefix if github_proxy_prefix is not None else os.environ.get("GITHUB_PROXY_PREFIX", "")
)
return f"{prefix}{DEFAULT_REPO_URL}" if prefix else DEFAULT_REPO_URL
def compute_next_run_at(daily_time: str, now: datetime | None = None) -> str:
current = now or local_now()
hour_text, minute_text = daily_time.split(":", 1)
candidate = current.replace(
hour=int(hour_text),
minute=int(minute_text),
second=0,
microsecond=0,
)
if candidate <= current:
candidate += timedelta(days=1)
return candidate.isoformat(timespec="seconds")
def default_schedule_config() -> dict[str, object]:
enabled = truthy_env("SYNC_SCHEDULE_ENABLED", "0")
daily_time = normalize_schedule_time(os.environ.get("SYNC_SCHEDULE_TIME", "03:00"))
timezone_name = os.environ.get("TZ", "UTC").strip() or "UTC"
github_proxy_prefix = normalize_github_proxy_prefix(os.environ.get("GITHUB_PROXY_PREFIX", ""))
return {
"enabled": enabled,
"daily_time": daily_time,
"timezone": timezone_name,
"github_proxy_prefix": github_proxy_prefix,
"next_run_at": compute_next_run_at(daily_time) if enabled else None,
"last_run_time": None,
"last_run_status": None,
"last_run_message": None,
"updated_at": None,
}
def normalize_schedule_config(raw: dict[str, object] | None) -> dict[str, object]:
config = default_schedule_config()
if isinstance(raw, dict):
if "enabled" in raw:
value = raw.get("enabled")
config["enabled"] = value if isinstance(value, bool) else str(value).strip().lower() in {"1", "true", "yes", "on"}
if "daily_time" in raw:
try:
config["daily_time"] = normalize_schedule_time(str(raw.get("daily_time") or ""))
except RuntimeError:
config["daily_time"] = normalize_schedule_time(os.environ.get("SYNC_SCHEDULE_TIME", "03:00"))
if raw.get("timezone"):
config["timezone"] = str(raw.get("timezone")).strip() or config["timezone"]
if "github_proxy_prefix" in raw:
try:
config["github_proxy_prefix"] = normalize_github_proxy_prefix(str(raw.get("github_proxy_prefix") or ""))
except RuntimeError:
config["github_proxy_prefix"] = normalize_github_proxy_prefix(os.environ.get("GITHUB_PROXY_PREFIX", ""))
for key in ("last_run_time", "last_run_status", "last_run_message", "updated_at"):
if raw.get(key) is not None:
config[key] = raw.get(key)
next_run_at = raw.get("next_run_at")
if config["enabled"] and isinstance(next_run_at, str) and next_run_at.strip():
try:
datetime.fromisoformat(next_run_at)
config["next_run_at"] = next_run_at
except ValueError:
config["next_run_at"] = compute_next_run_at(str(config["daily_time"]))
else:
config["next_run_at"] = compute_next_run_at(str(config["daily_time"])) if config["enabled"] else None
return config
def read_schedule_config() -> dict[str, object]:
with SCHEDULE_LOCK:
if not SCHEDULE_CONFIG_PATH.exists():
return normalize_schedule_config(None)
try:
payload = json.loads(SCHEDULE_CONFIG_PATH.read_text(encoding="utf-8"))
except Exception:
return normalize_schedule_config(None)
return normalize_schedule_config(payload if isinstance(payload, dict) else None)
def write_schedule_config(payload: dict[str, object]) -> dict[str, object]:
normalized = normalize_schedule_config(payload)
with SCHEDULE_LOCK:
SCHEDULE_CONFIG_PATH.parent.mkdir(parents=True, exist_ok=True)
SCHEDULE_CONFIG_PATH.write_text(
json.dumps(normalized, ensure_ascii=False, indent=2),
encoding="utf-8",
)
return normalized
def update_schedule_config(payload: dict[str, object]) -> dict[str, object]:
current = read_schedule_config()
enabled_raw = payload.get("enabled", current.get("enabled", False))
enabled = enabled_raw if isinstance(enabled_raw, bool) else str(enabled_raw).strip().lower() in {"1", "true", "yes", "on"}
daily_time = normalize_schedule_time(str(payload.get("daily_time") or current.get("daily_time") or "03:00"))
github_proxy_prefix = normalize_github_proxy_prefix(
str(payload.get("github_proxy_prefix") if "github_proxy_prefix" in payload else current.get("github_proxy_prefix") or "")
)
updated = {
**current,
"enabled": enabled,
"daily_time": daily_time,
"timezone": os.environ.get("TZ", "UTC").strip() or "UTC",
"github_proxy_prefix": github_proxy_prefix,
"next_run_at": compute_next_run_at(daily_time) if enabled else None,
"updated_at": local_now().isoformat(timespec="seconds"),
}
return write_schedule_config(updated)
def mark_schedule_run(status: str, message: str) -> dict[str, object]:
current = read_schedule_config()
updated = {
**current,
"last_run_time": local_now().isoformat(timespec="seconds"),
"last_run_status": status,
"last_run_message": message,
"next_run_at": compute_next_run_at(str(current.get("daily_time") or "03:00")) if current.get("enabled") else None,
"updated_at": local_now().isoformat(timespec="seconds"),
}
return write_schedule_config(updated)
def run_command(args: list[str]) -> subprocess.CompletedProcess[str]:
@@ -188,6 +360,9 @@ def get_status_payload() -> dict[str, object]:
mysql_ready = False
mysql_status = ""
sync_metadata = read_sync_metadata()
schedule_config = read_schedule_config()
github_proxy_prefix = str(schedule_config.get("github_proxy_prefix") or "")
effective_repo_url = get_effective_repo_url(github_proxy_prefix)
if mysql_auto_load:
mysql_proc = run_command(["python3", str(MYSQL_LOADER), "--check-only", "--wait-timeout", "5"])
if mysql_proc.returncode == 0:
@@ -206,13 +381,24 @@ def get_status_payload() -> dict[str, object]:
"data_root": str(DATA_ROOT),
"mysql_auto_load": mysql_auto_load,
"upstream_repo_url": DEFAULT_REPO_URL,
"effective_upstream_repo_url": effective_repo_url,
"upstream_branch": DEFAULT_BRANCH,
"last_sync_time": sync_metadata.get("last_sync_time"),
"last_upstream_commit": sync_metadata.get("last_upstream_commit"),
"last_sync_trigger": sync_metadata.get("last_trigger_source"),
"index_file": str(INDEX_PATH.relative_to(PROJECT_ROOT)),
"index_mtime": index_mtime,
"mysql_seed_file": str(MYSQL_SEED_PATH.relative_to(PROJECT_ROOT)),
"mysql_seed_mtime": mysql_seed_mtime,
"sync_schedule_file": str(SCHEDULE_CONFIG_PATH.relative_to(DATA_ROOT)),
"sync_schedule_enabled": schedule_config.get("enabled"),
"sync_schedule_time": schedule_config.get("daily_time"),
"sync_schedule_timezone": schedule_config.get("timezone"),
"github_proxy_prefix": github_proxy_prefix,
"sync_schedule_next_run": schedule_config.get("next_run_at"),
"sync_schedule_last_run_time": schedule_config.get("last_run_time"),
"sync_schedule_last_run_status": schedule_config.get("last_run_status"),
"sync_schedule_last_run_message": schedule_config.get("last_run_message"),
"mysql_host": mysql_host,
"mysql_port": mysql_port,
"mysql_database": mysql_database,
@@ -223,13 +409,16 @@ def get_status_payload() -> dict[str, object]:
}
def run_upstream_sync() -> dict[str, object]:
def run_upstream_sync(trigger_source: str = "manual") -> dict[str, object]:
if not SYNC_LOCK.acquire(blocking=False):
raise RuntimeError("已有同步任务在执行,请稍后再试。")
try:
schedule_config = read_schedule_config()
github_proxy_prefix = str(schedule_config.get("github_proxy_prefix") or "")
effective_repo_url = get_effective_repo_url(github_proxy_prefix)
upstream_proc = run_command(
["git", "ls-remote", DEFAULT_REPO_URL, f"refs/heads/{DEFAULT_BRANCH}"]
["git", "ls-remote", effective_repo_url, f"refs/heads/{DEFAULT_BRANCH}"]
)
upstream_commit = ""
if upstream_proc.returncode == 0 and upstream_proc.stdout.strip():
@@ -238,6 +427,7 @@ def run_upstream_sync() -> dict[str, object]:
command = [
"python3",
str(SYNC_SCRIPT),
f"--repo-url={effective_repo_url}",
"--build-index",
"--export-mysql-seed",
]
@@ -257,8 +447,11 @@ def run_upstream_sync() -> dict[str, object]:
"workspace_root": str(WORKSPACE_ROOT),
"data_root": str(DATA_ROOT),
"upstream_repo_url": DEFAULT_REPO_URL,
"effective_upstream_repo_url": effective_repo_url,
"github_proxy_prefix": github_proxy_prefix,
"upstream_branch": DEFAULT_BRANCH,
"upstream_commit": upstream_commit,
"trigger_source": trigger_source,
"last_sync_time": datetime.now().isoformat(timespec="seconds"),
"last_upstream_commit": upstream_commit,
"index_file": str(INDEX_PATH.relative_to(PROJECT_ROOT)),
@@ -274,7 +467,10 @@ def run_upstream_sync() -> dict[str, object]:
write_sync_metadata({
"last_sync_time": payload["last_sync_time"],
"last_upstream_commit": payload["last_upstream_commit"],
"last_trigger_source": trigger_source,
"upstream_repo_url": DEFAULT_REPO_URL,
"effective_upstream_repo_url": effective_repo_url,
"github_proxy_prefix": github_proxy_prefix,
"upstream_branch": DEFAULT_BRANCH,
})
return payload
@@ -282,6 +478,54 @@ def run_upstream_sync() -> dict[str, object]:
SYNC_LOCK.release()
def run_scheduled_sync_if_due() -> None:
schedule_config = read_schedule_config()
if not schedule_config.get("enabled"):
return
next_run_at = str(schedule_config.get("next_run_at") or "").strip()
if not next_run_at:
write_schedule_config({
**schedule_config,
"next_run_at": compute_next_run_at(str(schedule_config.get("daily_time") or "03:00")),
})
return
try:
next_run_dt = datetime.fromisoformat(next_run_at)
except ValueError:
write_schedule_config({
**schedule_config,
"next_run_at": compute_next_run_at(str(schedule_config.get("daily_time") or "03:00")),
})
return
if local_now() < next_run_dt:
return
try:
payload = run_upstream_sync(trigger_source="schedule")
message = str(payload.get("output") or "定时同步完成。")
mark_schedule_run("success", message)
print(f"[scheduler] upstream sync completed at {local_now().isoformat(timespec='seconds')}")
except RuntimeError as err:
status = "skipped" if "已有同步任务" in str(err) else "failed"
mark_schedule_run(status, str(err))
print(f"[scheduler] upstream sync {status}: {err}")
except Exception as err:
mark_schedule_run("failed", str(err))
print(f"[scheduler] upstream sync failed: {err}")
def scheduler_loop() -> None:
while True:
try:
run_scheduled_sync_if_due()
except Exception as err:
print(f"[scheduler] loop error: {err}")
time.sleep(SCHEDULER_POLL_SECONDS)
class MobileModelsHandler(SimpleHTTPRequestHandler):
def __init__(self, *args, **kwargs):
super().__init__(*args, directory=str(PROJECT_ROOT), **kwargs)
@@ -338,6 +582,35 @@ class MobileModelsHandler(SimpleHTTPRequestHandler):
except Exception as err:
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
return
if self.path == "/api/sync-schedule":
try:
content_length = int(self.headers.get("Content-Length", "0") or "0")
raw_body = self.rfile.read(content_length) if content_length > 0 else b"{}"
req = json.loads(raw_body.decode("utf-8") or "{}")
if not isinstance(req, dict):
raise RuntimeError("请求体必须是 JSON 对象。")
schedule_config = update_schedule_config(req)
self._send_json(
{
"message": "同步设置已保存。",
"sync_schedule_enabled": schedule_config.get("enabled"),
"sync_schedule_time": schedule_config.get("daily_time"),
"sync_schedule_timezone": schedule_config.get("timezone"),
"github_proxy_prefix": schedule_config.get("github_proxy_prefix"),
"effective_upstream_repo_url": get_effective_repo_url(
str(schedule_config.get("github_proxy_prefix") or "")
),
"sync_schedule_next_run": schedule_config.get("next_run_at"),
"sync_schedule_last_run_time": schedule_config.get("last_run_time"),
"sync_schedule_last_run_status": schedule_config.get("last_run_status"),
"sync_schedule_last_run_message": schedule_config.get("last_run_message"),
}
)
except RuntimeError as err:
self._send_json({"error": str(err)}, status=HTTPStatus.BAD_REQUEST)
except Exception as err:
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
return
self._send_json({"error": "Not found"}, status=HTTPStatus.NOT_FOUND)
@@ -350,7 +623,11 @@ def parse_args() -> argparse.Namespace:
def main() -> int:
apply_timezone_from_env()
write_schedule_config(read_schedule_config())
args = parse_args()
scheduler = threading.Thread(target=scheduler_loop, name="sync-scheduler", daemon=True)
scheduler.start()
server = ThreadingHTTPServer((args.host, args.port), MobileModelsHandler)
print(f"Serving MobileModels on http://{args.host}:{args.port}")
server.serve_forever()

View File

@@ -269,6 +269,35 @@
padding: 10px;
margin: 0;
}
.sync-schedule-card {
margin: 14px 0;
padding: 12px;
border: 1px solid var(--line);
border-radius: 12px;
background: #fbfcff;
}
.sync-schedule-grid {
display: grid;
grid-template-columns: minmax(220px, 280px) minmax(180px, 240px);
gap: 12px;
align-items: end;
}
.sync-schedule-grid .full-row {
grid-column: 1 / -1;
}
.check-row {
display: flex;
align-items: center;
gap: 8px;
margin: 0;
min-height: 42px;
}
.check-row input {
width: 16px;
height: 16px;
margin: 0;
flex: 0 0 auto;
}
.hidden { display: none; }
.modal-backdrop {
position: fixed;
@@ -397,6 +426,28 @@
<section id="syncTabPanel" class="manage-panel hidden">
<h3 class="title">原始数据同步</h3>
<p class="sub">从上游 `KHwang9883/MobileModels` 拉取原始 markdown 数据,并重建 `dist/device_index.json`。如已开启 MySQL 自动装载,也会同步刷新 MySQL。请先启动完整服务。</p>
<div class="sync-schedule-card">
<h4 class="title">每日自动同步</h4>
<p class="sub">在项目容器内按固定时间自动拉取上游原始数据,并重建索引与 MySQL Seed。时间按容器时区执行设置会持久化到运行期数据目录。</p>
<div class="sync-schedule-grid">
<label class="check-row">
<input id="scheduleEnabled" type="checkbox" />
<span>启用每日自动同步</span>
</label>
<div>
<label for="scheduleTimeInput">每日同步时间</label>
<input id="scheduleTimeInput" type="time" step="60" value="03:00" />
</div>
<div class="full-row">
<label for="githubProxyPrefixInput">GitHub 加速前缀</label>
<input id="githubProxyPrefixInput" type="text" placeholder="例如 https://ghfast.top/" />
</div>
</div>
<div class="btns">
<button id="saveSyncScheduleBtn" type="button" class="primary">保存同步设置</button>
</div>
<div id="scheduleStatus" class="sub">正在读取自动同步设置。</div>
</div>
<div class="btns">
<button id="syncUpstreamBtn" type="button" class="primary">开始同步原始数据</button>
<button id="refreshSyncStatusBtn" type="button">刷新同步状态</button>
@@ -470,12 +521,18 @@
const syncLogEl = document.getElementById("syncLog");
const syncUpstreamBtnEl = document.getElementById("syncUpstreamBtn");
const refreshSyncStatusBtnEl = document.getElementById("refreshSyncStatusBtn");
const scheduleEnabledEl = document.getElementById("scheduleEnabled");
const scheduleTimeInputEl = document.getElementById("scheduleTimeInput");
const githubProxyPrefixInputEl = document.getElementById("githubProxyPrefixInput");
const saveSyncScheduleBtnEl = document.getElementById("saveSyncScheduleBtn");
const scheduleStatusEl = document.getElementById("scheduleStatus");
const reloadIndexBtnEl = document.getElementById("reloadIndexBtn");
const indexStatusEl = document.getElementById("indexStatus");
const indexSummaryEl = document.getElementById("indexSummary");
let syncSupported = false;
let syncRunning = false;
let scheduleSaving = false;
function normalizeText(text) {
return (text || "").toLowerCase().replace(/[^0-9a-z\u4e00-\u9fff]+/g, "");
@@ -502,6 +559,7 @@
function updateSyncButtons() {
syncUpstreamBtnEl.disabled = syncRunning || !syncSupported;
refreshSyncStatusBtnEl.disabled = syncRunning;
saveSyncScheduleBtnEl.disabled = syncRunning || scheduleSaving;
}
function renderIndexStatus(message, details) {
@@ -522,6 +580,8 @@
if (data.workspace_root) lines.push(`工作空间目录: ${data.workspace_root}`);
if (data.storage_mode) lines.push(`存储模式: ${data.storage_mode}`);
if (data.upstream_repo_url) lines.push(`上游仓库: ${data.upstream_repo_url}`);
if (data.github_proxy_prefix) lines.push(`GitHub 加速前缀: ${data.github_proxy_prefix}`);
if (data.effective_upstream_repo_url) lines.push(`实际同步地址: ${data.effective_upstream_repo_url}`);
if (data.upstream_branch) lines.push(`上游分支: ${data.upstream_branch}`);
if (data.last_sync_time) lines.push(`最近同步时间: ${data.last_sync_time}`);
if (data.last_upstream_commit) lines.push(`最近同步提交: ${data.last_upstream_commit}`);
@@ -543,8 +603,33 @@
syncLogEl.textContent = lines.join("\n").trim() || "暂无同步记录";
}
function renderScheduleStatus(data, options = {}) {
const preserveMessage = !!options.preserveMessage;
const enabled = !!(data && data.sync_schedule_enabled);
const dailyTime = (data && data.sync_schedule_time) || "03:00";
const githubProxyPrefix = (data && data.github_proxy_prefix) || "";
scheduleEnabledEl.checked = enabled;
scheduleTimeInputEl.value = dailyTime;
githubProxyPrefixInputEl.value = githubProxyPrefix;
if (preserveMessage) return;
const lines = [
`每日自动同步: ${enabled ? "已启用" : "未启用"}`,
`同步时间: ${dailyTime}`,
];
if (data && data.sync_schedule_timezone) lines.push(`容器时区: ${data.sync_schedule_timezone}`);
if (githubProxyPrefix) lines.push(`GitHub 加速前缀: ${githubProxyPrefix}`);
if (data && data.effective_upstream_repo_url) lines.push(`实际同步地址: ${data.effective_upstream_repo_url}`);
if (data && data.sync_schedule_next_run) lines.push(`下次执行: ${data.sync_schedule_next_run}`);
if (data && data.sync_schedule_last_run_time) lines.push(`最近自动执行: ${data.sync_schedule_last_run_time}`);
if (data && data.sync_schedule_last_run_status) lines.push(`最近执行结果: ${data.sync_schedule_last_run_status}`);
if (data && data.sync_schedule_last_run_message) lines.push(`结果详情: ${data.sync_schedule_last_run_message}`);
scheduleStatusEl.textContent = lines.join("\n");
}
async function loadSyncStatus(options = {}) {
const preserveLog = !!options.preserveLog;
const preserveScheduleMessage = !!options.preserveScheduleMessage;
syncStatusEl.textContent = "正在检测同步能力。";
try {
const data = await fetchJson("/api/status", { cache: "no-store" });
@@ -552,12 +637,16 @@
syncStatusEl.textContent = syncSupported
? "已连接 Docker Compose 服务,可以直接从页面同步原始数据、索引和 MySQL。"
: "当前服务不支持原始数据同步。";
renderScheduleStatus(data, { preserveMessage: preserveScheduleMessage });
if (!preserveLog) {
renderSyncLog(data, "服务状态");
}
} catch (err) {
syncSupported = false;
syncStatusEl.textContent = `当前页面未连接支持同步的 Docker Compose 服务:${err.message}`;
if (!preserveScheduleMessage) {
scheduleStatusEl.textContent = `自动同步设置读取失败: ${err.message}`;
}
if (!preserveLog) {
syncLogEl.textContent = "请使用 `docker compose up --build -d` 启动完整服务后,再使用这个功能。";
}
@@ -593,6 +682,31 @@
}
}
async function saveSyncSchedule() {
scheduleSaving = true;
updateSyncButtons();
scheduleStatusEl.textContent = "正在保存每日自动同步设置...";
try {
const payload = await fetchJson("/api/sync-schedule", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
enabled: !!scheduleEnabledEl.checked,
daily_time: scheduleTimeInputEl.value || "03:00",
github_proxy_prefix: githubProxyPrefixInputEl.value || "",
}),
});
scheduleStatusEl.textContent = payload.message || "每日自动同步设置已保存。";
await loadSyncStatus({ preserveLog: true, preserveScheduleMessage: true });
renderScheduleStatus(payload);
} catch (err) {
scheduleStatusEl.textContent = `保存失败: ${err.message}`;
} finally {
scheduleSaving = false;
updateSyncButtons();
}
}
function normalizeAliasList(name, aliases) {
const out = [];
const seen = new Set();
@@ -1235,6 +1349,7 @@
manufacturerCountBtnEl.addEventListener("click", openManufacturerListModal);
syncUpstreamBtnEl.addEventListener("click", runUpstreamSync);
refreshSyncStatusBtnEl.addEventListener("click", loadSyncStatus);
saveSyncScheduleBtnEl.addEventListener("click", saveSyncSchedule);
reloadIndexBtnEl.addEventListener("click", loadIndexFromPath);
brandModalCancelBtnEl.addEventListener("click", closeBrandModal);