Compare commits

10 commits: f35dcc4a18 ... 0cb08642aa

| SHA1 |
|---|
| 0cb08642aa |
| d3d1a8650e |
| dfddbb5ea0 |
| ac9720e7de |
| a64725d60c |
| 1b420cd492 |
| 74e50a2b30 |
| f12b3d5ecd |
| 3c0e5ed49c |
| f6ba48a0d0 |
@@ -0,0 +1,7 @@
.git
.DS_Store
.env
__pycache__
*.pyc
*.pyo
*.swp
@@ -0,0 +1,37 @@
# Compose resolves values in the following priority order:
# 1. environment variables in the current shell
# 2. .env in the project root
# 3. defaults in docker-compose.yml

# Remote MySQL address
MYSQL_HOST=your.mysql.host
MYSQL_PORT=3306

# Container time zone
TZ=Asia/Shanghai

# Admin account: used for schema / seed loading
MYSQL_ROOT_USER=root
MYSQL_ROOT_PASSWORD=mobilemodels_root_change_me

# Business database name
MYSQL_DATABASE=mobilemodels

# Read-only account: used for page SQL queries and third-party integration
MYSQL_READER_USER=mobilemodels_reader
MYSQL_READER_PASSWORD=mobilemodels_reader_change_me

# Whether to load MySQL automatically on container start or after a raw-data sync
# Keep 0 for a remote MySQL
# Can be set to 1 for a local test MySQL
MYSQL_AUTO_LOAD=0

# Whether to enable the project's built-in daily auto sync
SYNC_SCHEDULE_ENABLED=0

# Daily auto-sync time, HH:MM format
SYNC_SCHEDULE_TIME=03:00

# GitHub acceleration prefix; leave empty for a direct connection
# e.g. https://ghfast.top/
GITHUB_PROXY_PREFIX=
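The precedence order spelled out in the comments above can be sketched as a tiny resolver. This is an illustration only: `shell_env`, `dotenv` and `compose_defaults` are hypothetical stand-ins for the three sources, and real Compose interpolation has extra forms (such as `${VAR:-default}` vs `${VAR-default}`) not modeled here.

```python
def resolve(name, shell_env, dotenv, compose_defaults):
    """Pick a value the way the comments describe: the current shell wins,
    then the project-root .env, then the default in docker-compose.yml."""
    for source in (shell_env, dotenv, compose_defaults):
        if name in source:
            return source[name]
    return None

shell_env = {"MYSQL_PORT": "3307"}                     # exported in the shell
dotenv = {"MYSQL_PORT": "3306", "TZ": "Europe/Paris"}  # from .env
compose_defaults = {"MYSQL_PORT": "3306", "TZ": "Asia/Shanghai",
                    "MYSQL_DATABASE": "mobilemodels"}  # from docker-compose.yml

print(resolve("MYSQL_PORT", shell_env, dotenv, compose_defaults))      # 3307 (shell wins)
print(resolve("TZ", shell_env, dotenv, compose_defaults))              # Europe/Paris (.env wins)
print(resolve("MYSQL_DATABASE", shell_env, dotenv, compose_defaults))  # mobilemodels (compose default)
```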
@@ -0,0 +1,2 @@
.DS_Store
.env
+16
@@ -0,0 +1,16 @@
FROM python:3.12-slim

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1

WORKDIR /app

RUN apt-get update \
    && apt-get install -y --no-install-recommends git ca-certificates default-mysql-client tzdata \
    && rm -rf /var/lib/apt/lists/*

COPY . /app

EXPOSE 8123

CMD ["sh", "tools/container_start.sh"]
@@ -1,132 +1,55 @@
 # 手机品牌型号汇总
-
-[](https://github.com/KHwang9883/MobileModels/issues)
-[](https://github.com/KHwang9883/MobileModels/pulls)
-[](https://github.com/KHwang9883/MobileModels)
-[](https://github.com/KHwang9883/MobileModels)
-[](https://creativecommons.org/licenses/by-nc-sa/4.0/)
-
-A summary of phone models released by each vendor and their marketing names.
-
-[English](README_en.md)
-
-- ✅ included
-- ⏹ partially included
-- ❌ not included
-
-| Name | Brand | Coverage | codename | Intl. models | Notes |
-| :-: | :-: | :-: | :-: | :-: | :-: |
-| [360shouji](brands/360shouji.md) | 360 | All 360/QiKu phones | ❌ | ❌ | -- |
-| [apple_all](brands/apple_all.md) | Apple | All iPhone/iPad/iPod touch/Apple Watch/Apple TV/Apple Vision | ✅ | ✅ | -- |
-| [apple_all_en](brands/apple_all_en.md) | Apple | All iPhone/iPad/iPod touch/Apple Watch/Apple TV/Apple Vision | ✅ | ✅ | English version |
-| [apple_cn](brands/apple_cn.md) | Apple | All mainland-China iPhone/iPad/iPod touch/Apple Watch/Apple Vision | ✅ | ❌ | -- |
-| [asus_cn](brands/asus_cn.md) | ASUS | ROG Phone etc. | ✅ | ❌ | -- |
-| [asus_en](brands/asus_en.md) | ASUS | ROG Phone/Zenfone | ✅ | ✅ | English version |
-| [blackshark](brands/blackshark.md) | Black Shark | All models | ✅ | ✅ | -- |
-| [blackshark_en](brands/blackshark_en.md) | Black Shark | All models | ✅ | ✅ | English version |
-| [coolpad](brands/coolpad.md) | Coolpad | Recent Coolpad smartphones | ❌ | ❌ | -- |
-| [google](brands/google.md) | Google | Google Pixel phones/tablets/watches | ✅ | ✅ | English version |
-| [honor_cn](brands/honor_cn.md) | HONOR | HONOR phones/tablets/laptops/smart screens/wearables, mainland-China models only | ⏹ | ❌ | -- |
-| [honor_global_en](brands/honor_global_en.md)| HONOR | HONOR phones/tablets, international models only | ⏹ | ✅ | English version |
-| [huawei_cn](brands/huawei_cn.md) | HUAWEI | HUAWEI Mate/Pura/nova/G/Maimang/Enjoy series, tablets, MateBook, smart screens and wearables, mainland-China models only | ⏹ | ❌ | [See here for other early models](misc/early-huawei-models.md) |
-| [huawei_global_en](brands/huawei_global_en.md)| HUAWEI | HUAWEI Mate/Pura/nova/Y series and tablets, international models only | ⏹ | ⏹ | English version |
-| [lenovo_cn](brands/lenovo_cn.md) | Lenovo | Lenovo-brand models released since 2017, all ZUK models | ✅ | ❌ | -- |
-| [letv](brands/letv.md) | Letv | All phone models | ❌ | ❌ | TV products not included |
-| [meizu](brands/meizu.md) | MEIZU | All models | ✅ | ✅ | -- |
-| [meizu_en](brands/meizu_en.md) | MEIZU | All models | ✅ | ✅ | English version |
-| [mitv_cn](brands/mitv_cn.md) | Xiaomi | All mainland-China Xiaomi/Redmi TVs and set-top boxes | ❌ | ❌ | -- |
-| [mitv_global_en](brands/mitv_global_en.md) | Xiaomi | All Xiaomi/Redmi TVs, set-top boxes and TV sticks, international models only | ❌ | ✅ | English version |
-| [motorola_cn](brands/motorola_cn.md) | Motorola | Models released since 2015 | ✅ | ❌ | -- |
-| [nokia_cn](brands/nokia_cn.md) | Nokia | HMD Global-made smartphones since 2017 | ✅ | ❌ | -- |
-| [nothing](brands/nothing.md) | Nothing | All models | ✅ | ✅ | English version |
-| [nubia](brands/nubia.md) | nubia | All models | ❌ | ⏹ | -- |
-| [oneplus](brands/oneplus.md) | OnePlus | All models | ✅ | ✅ | -- |
-| [oneplus_en](brands/oneplus_en.md) | OnePlus | All models | ✅ | ✅ | English version |
-| [oppo_cn](brands/oppo_cn.md) | OPPO | Mainland-China models using the new naming scheme since 2018 | ⏹ | ❌ | -- |
-| [oppo_global_en](brands/oppo_global_en.md) | OPPO | International models released since 2018 | ⏹ | ⏹ | English version |
-| [realme_cn](brands/realme_cn.md) | realme | All mainland-China models | ⏹ | ❌ | -- |
-| [realme_global_en](brands/realme_global_en.md) | realme | All international models | ⏹ | ✅ | English version |
-| [samsung_cn](brands/samsung_cn.md) | Samsung | Galaxy S/Note/A/Z/M/C/J/On/Tab/心系天下 series and a few other models, mainland-China models only | ✅ | ❌ | [See here for other early models](misc/early-samsung-models.md) |
-| [samsung_global_en](brands/samsung_global_en.md) | Samsung | Galaxy S/Note/A/Z/M/F series, models released since 2019 | ✅ | ⏹ | English version |
-| [smartisan](brands/smartisan.md) | Smartisan | All models | ✅ | ❌ | -- |
-| [sony](brands/sony.md) | SONY | Models released since 2015 | ✅ | ✅ | English version |
-| [sony_cn](brands/sony_cn.md) | SONY | Mainland-China models released since 2015 | ✅ | ❌ | -- |
-| [vivo_cn](brands/vivo_cn.md) | vivo | Mainland-China models using the new naming scheme since 2018 | ✅ | ❌ | -- |
-| [vivo_global_en](brands/vivo_global_en.md) | vivo | International models released since 2019 | ⏹ | ⏹ | English version |
-| [xiaomi](brands/xiaomi.md) | Xiaomi | Xiaomi/REDMI/POCO phones & tablets etc. | ✅ | ✅ | -- |
-| [xiaomi_cn](brands/xiaomi_cn.md) | Xiaomi | Xiaomi/REDMI phones & tablets etc. | ✅ | ✅ | English version |
-| [xiaomi_en](brands/xiaomi_en.md) | Xiaomi | Xiaomi/REDMI/POCO phones & tablets etc. | ✅ | ✅ | English version |
-| [xiaomi-wear](brands/xiaomi-wear.md) | Xiaomi | Xiaomi/Redmi watches, bands, TWS and other wearables | ⏹ | ✅ | TWS excludes outsourced models; kids' watch models not yet included |
-| [zhixuan](brands/zhixuan.md) | Huawei Zhixuan (华为智选) | U-MAGIC/China Telecom Maimang/NZONE/Hi nova/FFALCON/TD Tech/WIKO | ⏹ | ❌ | -- |
-| [zte_cn](brands/zte_cn.md) | ZTE | Models released since 2017 | ❌ | ❌ | -- |
-
-## Changelog
-
-See [CHANGELOG.md](CHANGELOG.md)
-
-## License
-
-<a rel="license" href="https://creativecommons.org/licenses/by-nc-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.
-
-## Project history
-
-### March 2024
-- Moved this project's CSV files and scripts to [this repo](https://github.com/KHwang9883/MobileModels-csv), auto-updated via GitHub Actions.
-
-### April 2022
-- Added a summary of [bootloader unlock / kernel source status of major Android vendors](https://github.com/KHwang9883/bootloader-kernel-source) (no longer updated).
-
-### December 2021
-- Added a summary of [model naming rules by brand](misc/naming-rules.md).
-
-### July 2019
-- The document edition was discontinued.
-
-### April 2019
-- The document edition moved to direct GitHub downloads.
-- English version added.
-
-### March 2019
-- The document edition moved to Weiyun.
-
-### November 2018
-- The project was synced to GitHub.
-
-### July 2018
-- After the Xiaomi Community post went dead, the "Mobile Models" summary was published openly, with a document edition downloadable from Baidu Netdisk.
-
-### March 2016
-- The Xiaomi model summary was posted to [Xiaomi Community](http://bbs.xiaomi.cn/t-12641411) (post no longer available).
-
-### February 2016
-- I started collecting model numbers for some Chinese phone brands; the first draft of "Mobile Models" was born.
-
-[](https://starchart.cc/KHwang9883/MobileModels)
-
-## References
-
-- [TENAA](http://zd.taf.org.cn)
-- [Product certification query (CQCCMS)](http://webdata.cqccms.com.cn/webdata/query/CCCCerti.do)
-- [MIIT government service platform](https://ythzxfw.miit.gov.cn/resultQuery)
-- [China Telecom Tianyi device database](http://surfing.tydevice.com/)
-- [Google Play supported devices](http://storage.googleapis.com/play_public/supported_devices.html)
-- [Wi-Fi Alliance](https://www.wi-fi.org)
-- [Bluetooth Launch Studio](https://launchstudio.bluetooth.com/Listings/Search)
-- [Xiaomi Firmware Updater](https://xiaomifirmwareupdater.com/)
-- [Huawei Open Source Release Center](https://consumer.huawei.com/en/opensource/)
-- [ReaMEIZU](https://reameizu.com/)
-- [The Apple Wiki](https://theapplewiki.com/)
-- [ipsw.me](https://ipsw.me)
-- [XDA Developers](https://www.xda-developers.com)
-- [Huawei Firmware Database](https://pro-teammt.ru/en/online-firmware-database-ru/)
-- [XSMS IMEI database](http://xsms.com.ua/phone/imei/all/1)
-- [Android Dumps](https://dumps.tadiphone.dev/dumps)
-- [Lenovo Android タブレット一覧](https://idomizu.dev/archives/20150)
-
-Plus each brand's official site, forums, Weibo and more, too many to list individually.
-
-## Contact
-
-For related questions, please [file an issue](https://github.com/KHwang9883/MobileModels/issues). PRs are welcome for mistakes or omissions.
-
-I use the same handle (@KHwang9883) on other platforms, but will not necessarily answer repo-related questions there.
+
+The project now uses the repository root as the single entry point; the device query, data management and MySQL services start directly via Docker Compose.
+
+## Startup
+
+```bash
+docker compose up --build -d
+```
+
+To run a local test MySQL alongside, overlay the test config:
+
+```bash
+docker compose -f docker-compose.yml -f docker-compose.test.yml up --build -d
+```
+
+To customize the MySQL connection, copy the env template first:
+
+```bash
+cp .env.example .env
+```
+
+Page entry points:
+
+- `http://127.0.0.1:8123/web/device_query.html`
+- `http://127.0.0.1:8123/web/brand_management.html`
+- `http://127.0.0.1:8123/web/device_query.html?view=docs`
+
+## Directory layout
+
+```text
+workspace/   upstream raw data, supplementary material and history files
+dist/        build outputs and the MySQL seed
+docs/        project documentation
+sql/         MySQL schema
+tools/       build, sync, import and service scripts
+web/         pages and static assets
+```
+
+## Notes
+
+- `workspace/` holds the raw-data workspace
+- `docker-compose.yml`, `Dockerfile` and `tools/` live in the project root
+- the default main config targets a remote MySQL
+- the MySQL in `docker-compose.test.yml` is for local testing only
+- `dist/device_index.json` and `dist/mobilemodels_mysql_seed.sql` generated in the container are mounted straight into the host project's `dist/`
+- Compose reads shell environment variables and the project-root `.env` first, then falls back to the defaults in `docker-compose.yml`
+- upstream raw git sync, index building and MySQL refresh all run inside the container
+- the project ships a built-in "daily auto sync" scheduler independent of GitHub Actions; the run time can be set on the data management page or overridden via `.env`
+- for GitHub acceleration, set `GITHUB_PROXY_PREFIX`, or change it directly on the data management page
+
+More details:
+
+- [docs/README.md](docs/README.md)
+- [docs/web-ui.md](docs/web-ui.md)
+44
-49
@@ -1,60 +1,55 @@
-# Mobile Models
-
-[](https://github.com/KHwang9883/MobileModels/issues)
-[](https://github.com/KHwang9883/MobileModels/pulls)
-[](https://github.com/KHwang9883/MobileModels)
-[](https://github.com/KHwang9883/MobileModels)
-[](https://creativecommons.org/licenses/by-nc-sa/4.0/)
-
-Collecting device names, models and internal codenames.
-
-[Issue submission](https://github.com/KHwang9883/MobileModels/issues) and [Pull Requests](https://github.com/KHwang9883/MobileModels/pulls) are welcomed if you find mistakes.
-
-Unlisted brands usually do not include international models.
-
-| Name | Brand | Range |
-| :-: | :-: | :-: |
-| [apple_all_en](brands/apple_all_en.md) | Apple | iPhone, iPad, iPod touch, Apple Watch, Apple TV and Apple Vision |
-| [asus_en](brands/asus_en.md) | ASUS | ROG Phone, Zenfone |
-| [blackshark_en](brands/blackshark_en.md) | Black Shark | All models |
-| [google](brands/google.md) | Google | Google Pixel phones, tablets & watch |
-| [honor_global_en](brands/honor_global_en.md) | HONOR | All international models |
-| [huawei_global_en](brands/huawei_global_en.md) | HUAWEI | HUAWEI Mate, Pura, nova & Y series, MediaPad & MatePad series |
-| [meizu_en](brands/meizu_en.md) | Meizu | All models |
-| [mitv_global_en](brands/mitv_global_en.md) | Xiaomi | All international/Indian Xiaomi & Redmi TV models (excluding Chinese models) |
-| [nothing](brands/nothing.md) | Nothing | All models |
-| [oneplus_en](brands/oneplus_en.md) | OnePlus | All models |
-| [oppo_global_en](brands/oppo_global_en.md) | OPPO | International models since 2018 |
-| [samsung_global_en](brands/samsung_global_en.md) | Samsung | International models since 2019 |
-| [sony](brands/sony.md) | Sony | All models since 2015 |
-| [realme_global_en](brands/realme_global_en.md) | realme | All international models |
-| [vivo_global_en](brands/vivo_global_en.md) | vivo | International models since 2019 |
-| [xiaomi_en](xiaomi_en.md) | Xiaomi | Xiaomi/Redmi/POCO phones & tablets |
-
-## Changelog
-
-[CHANGELOG_en.md](CHANGELOG_en.md)
-
-## References
-
-- [TENAA](http://zd.taf.org.cn)
-- [CQCCMS](http://webdata.cqccms.com.cn/webdata/query/CCCCerti.do)
-- [MIIT](https://ythzxfw.miit.gov.cn/resultQuery)
-- [China Telecom Tianyi Devices](http://surfing.tydevice.com/)
-- [Google Play Supported Devices](http://storage.googleapis.com/play_public/supported_devices.html)
-- [Wi-Fi Alliance](https://www.wi-fi.org)
-- [Bluetooth Launch Studio](https://launchstudio.bluetooth.com/Listings/Search)
-- [Xiaomi Firmware Updater](https://xiaomifirmwareupdater.com/)
-- [Huawei Open Source Release Center](https://consumer.huawei.com/en/opensource/)
-- [ReaMEIZU](https://reameizu.com/)
-- [The Apple Wiki](https://theapplewiki.com/)
-- [ipsw.me](https://ipsw.me)
-- [XDA Developers](https://www.xda-developers.com)
-- [Huawei Firmware Database](https://pro-teammt.ru/en/online-firmware-database-ru/)
-- [XSMS IMEI Database](http://xsms.com.ua/phone/imei/all/1)
-- [Android Dumps](https://dumps.tadiphone.dev/dumps)
-- [Lenovo Android タブレット一覧](https://idomizu.dev/archives/20150)
-
-## License
-
-<a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.
+# MobileModels
+
+The project now uses the repository root as the single runtime entry and can be started directly with Docker Compose.
+
+## Run
+
+```bash
+docker compose up --build -d
+```
+
+If you want a local test MySQL together with the app:
+
+```bash
+docker compose -f docker-compose.yml -f docker-compose.test.yml up --build -d
+```
+
+If you need custom MySQL settings, start by copying the env template:
+
+```bash
+cp .env.example .env
+```
+
+Entry pages:
+
+- `http://127.0.0.1:8123/web/device_query.html`
+- `http://127.0.0.1:8123/web/brand_management.html`
+- `http://127.0.0.1:8123/web/device_query.html?view=docs`
+
+## Structure
+
+```text
+workspace/   upstream raw data, notes, and history files
+dist/        build outputs and MySQL seed
+docs/        project docs
+sql/         MySQL schema
+tools/       build, sync, import, and service scripts
+web/         UI pages and static assets
+```
+
+Notes:
+
+- `workspace/` stores the source workspace
+- `docker-compose.yml`, `Dockerfile`, and `tools/` live in the project root
+- the main compose file targets remote MySQL usage
+- `docker-compose.test.yml` provides a local MySQL only for testing
+- generated `dist/device_index.json` and `dist/mobilemodels_mysql_seed.sql` are bind-mounted back to the host project's `dist/` directory
+- Compose reads shell env vars and project-root `.env` first, then falls back to defaults in `docker-compose.yml`
+- upstream git sync, index rebuild, and MySQL refresh run inside containers
+- the project includes its own daily sync scheduler; you can configure the time in the Data Management page or override it via `.env`
+- GitHub acceleration by URL prefix is supported through `GITHUB_PROXY_PREFIX` or the Data Management page
+
+More details:
+
+- [docs/README.md](docs/README.md)
+- [docs/web-ui.md](docs/web-ui.md)
Vendored: +194261 (file diff suppressed because it is too large)
Vendored: +33113 (file diff suppressed because it is too large)
@@ -0,0 +1,35 @@
services:
  mysql:
    image: mysql:8.4
    container_name: mobilemodels-mysql
    command:
      - --character-set-server=utf8mb4
      - --collation-server=utf8mb4_0900_ai_ci
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD:-mobilemodels_root}
      MYSQL_DATABASE: ${MYSQL_DATABASE:-mobilemodels}
    ports:
      - "3306:3306"
    volumes:
      - mobilemodels_mysql_data:/var/lib/mysql
    healthcheck:
      test: ["CMD-SHELL", "mysqladmin ping -h127.0.0.1 -uroot -p$$MYSQL_ROOT_PASSWORD --silent"]
      interval: 5s
      timeout: 5s
      retries: 30
      start_period: 20s
    restart: unless-stopped
    init: true

  mobilemodels:
    environment:
      MYSQL_HOST: mysql
      MYSQL_PORT: 3306
      MYSQL_ROOT_USER: root
      MYSQL_AUTO_LOAD: 1
    depends_on:
      mysql:
        condition: service_healthy

volumes:
  mobilemodels_mysql_data:
@@ -0,0 +1,34 @@
services:
  mobilemodels:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: mobilemodels-web
    working_dir: /app
    environment:
      MOBILEMODELS_DATA_ROOT: /data
      TZ: ${TZ:-Asia/Shanghai}
      MYSQL_HOST: ${MYSQL_HOST:-host.docker.internal}
      MYSQL_PORT: ${MYSQL_PORT:-3306}
      MYSQL_DATABASE: ${MYSQL_DATABASE:-mobilemodels}
      MYSQL_ROOT_USER: ${MYSQL_ROOT_USER:-root}
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD:-mobilemodels_root}
      MYSQL_READER_USER: ${MYSQL_READER_USER:-mobilemodels_reader}
      MYSQL_READER_PASSWORD: ${MYSQL_READER_PASSWORD:-mobilemodels_reader_change_me}
      MYSQL_AUTO_LOAD: ${MYSQL_AUTO_LOAD:-0}
      SYNC_SCHEDULE_ENABLED: ${SYNC_SCHEDULE_ENABLED:-0}
      SYNC_SCHEDULE_TIME: ${SYNC_SCHEDULE_TIME:-03:00}
      GITHUB_PROXY_PREFIX: ${GITHUB_PROXY_PREFIX:-}
    command: ["sh", "tools/container_start.sh"]
    ports:
      - "8123:8123"
    volumes:
      - ./dist:/app/dist
      - mobilemodels_app_data:/data
    extra_hosts:
      - "host.docker.internal:host-gateway"
    restart: unless-stopped
    init: true

volumes:
  mobilemodels_app_data:
@@ -0,0 +1,18 @@
# Project docs

The deliverable documentation is consolidated in this directory for deployment, training and handover.

## Index

- [Deployment & usage](web-ui.md)
- [MySQL design notes](mysql-query-design.md)
- [Index build & device mapping notes](device-mapper.md)

## Contents

- `web-ui.md`
  - Docker Compose startup, page entry points, MySQL connection and management features
- `mysql-query-design.md`
  - Main table design, compatibility views, recommended query patterns
- `device-mapper.md`
  - How `dist/device_index.json` is built and what its index fields mean
@@ -0,0 +1,57 @@
# Device Mapper Usage

This tool builds a cross-platform lookup index from `workspace/brands/*.md`.

## 1) Build index

```bash
python3 tools/device_mapper.py build
```

Output file: `dist/device_index.json`

## 2) Query from the command line

```bash
python3 tools/device_mapper.py find --name 'iPhone14,5' --brand Apple
python3 tools/device_mapper.py find --name 'M2102J2SC' --brand Xiaomi
python3 tools/device_mapper.py find --name 'L55M5-AD' --brand Xiaomi
```

## 3) JSON structure

- `records`: normalized device records
  - `device_name`: standard marketing name
  - `brand`: normalized brand
  - `manufacturer_brand`: manufacturer-level brand
  - `market_brand`: market sub-brand
  - `device_type`: `phone | tablet | wear | tv | other`
  - `aliases`: all searchable aliases
- `lookup`: normalized alias -> candidate `record.id[]`
- `brand_aliases`: normalized brand aliases used to filter by the app-provided brand
- `brand_management`: brand governance metadata

## 4) App-side integration

1. Load `dist/device_index.json` into memory.
2. Normalize the input `name` and the optional `brand`.
3. Use `lookup[normalized_name]` to fetch candidate records.
4. Normalize the brand via `brand_management`.
5. Filter records by normalized manufacturer or market brand when needed.
6. Return the first candidate, or all candidates.

Normalization rule:

- lower-case
- keep only `[0-9a-z\u4e00-\u9fff]`
- remove spaces, hyphens, underscores and punctuation

## 5) Device type mapping

Supported categories:

- `phone`
- `tablet`
- `wear`
- `tv`
- `other`
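The integration steps and the normalization rule above can be sketched as follows. This is a hedged illustration, not the tool's actual code: the index fragment is hypothetical (records keyed by id for brevity, `Example Phone 12` and `ExampleCo` are made up), and brand filtering is shown against `brand_aliases` rather than the full `brand_management` metadata.

```python
import re

def normalize(text):
    # Documented rule: lower-case, keep only [0-9a-z\u4e00-\u9fff],
    # dropping spaces, hyphens, underscores and punctuation.
    return "".join(re.findall(r"[0-9a-z\u4e00-\u9fff]", text.lower()))

def find(index, name, brand=None):
    # Steps 2-6: normalize the input, fetch candidates from `lookup`,
    # then optionally filter by the normalized brand.
    candidates = [index["records"][rid]
                  for rid in index["lookup"].get(normalize(name), [])]
    if brand:
        want = index["brand_aliases"].get(normalize(brand), normalize(brand))
        candidates = [r for r in candidates
                      if want in (normalize(r["manufacturer_brand"]),
                                  normalize(r["market_brand"]))]
    return candidates

# Hypothetical index fragment in the documented shape; real entries
# come from dist/device_index.json.
index = {
    "records": {"r1": {"device_name": "Example Phone 12",
                       "manufacturer_brand": "ExampleCo",
                       "market_brand": "Example",
                       "device_type": "phone"}},
    "lookup": {"ex1234": ["r1"]},
    "brand_aliases": {"exampleco": "exampleco"},
}
print(find(index, "EX-1234", brand="ExampleCo")[0]["device_name"])  # Example Phone 12
```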
@@ -0,0 +1,142 @@
# MySQL design notes

This document describes how the deliverable MobileModels build organizes its MySQL data, the compatibility layer, and the recommended query patterns.

## Goals

- every device identifier can be resolved through a MySQL query
- third parties can query the database directly, at good speed
- access paths compatible with the legacy structure are preserved
- the web pages and the SQL consumers share the same device data

## Main table

The recommended physical table:

```sql
mobilemodels.mm_device_catalog
```

The main table consolidates device model, brand, manufacturer, source, normalized-alias results and compatibility fields, and is intended as the single query entry point.

### Key columns

- `model`
- `record_id`
- `alias_norm`
- `device_name`
- `brand`
- `manufacturer_brand`
- `parent_brand`
- `market_brand`
- `device_type`
- `source_file`
- `section`
- `source_rank`
- `source_weight`
- `code`
- `code_alias`
- `ver_name`

## Recommended queries

### 1. Third parties querying the main table directly

An equality lookup on `alias_norm` is recommended:

```sql
SELECT
  model,
  record_id,
  alias_norm,
  device_name,
  brand,
  manufacturer_brand,
  parent_brand,
  market_brand,
  device_type,
  source_file,
  section,
  source_rank,
  source_weight,
  code,
  code_alias,
  ver_name
FROM mobilemodels.mm_device_catalog
WHERE alias_norm = ?
ORDER BY source_rank ASC, record_id ASC
LIMIT 20;
```

### 2. Page SQL queries

The page's `SQL 查询` (SQL query) tab is backed by the same main table.

Query flow:

1. Receive the raw value reported by the client
2. Normalize it into `alias_norm` on the server
3. Run the equality query against the main table
4. Return the result list, the executed SQL and the JSON output

## Compatibility views

The following views are kept for legacy systems:

```sql
mobilemodels.mm_device_lookup
mobilemodels.mm_device_record
mobilemodels.models
python_services_test.models
```

The legacy structure `python_services_test.models` exists mainly for existing query logic and is no longer the recommended entry point.

## Legacy query example

```sql
SELECT
  model,
  dtype,
  brand,
  brand_title,
  code,
  code_alias,
  model_name,
  ver_name
FROM python_services_test.models
WHERE model = ?
LIMIT 20;
```

## Normalization rule

`alias_norm` is always generated with the following rules:

- lower-case everything
- keep only `[0-9a-z]` and CJK characters
- strip spaces, hyphens, underscores and other punctuation

Examples:

```text
SM-G9980   -> smg9980
iPhone14,2 -> iphone142
NOH-AL00   -> nohal00
```

## Data pipeline

The main table and the index data are both produced by:

1. Sync the upstream raw markdown data
2. Parse `workspace/brands/*.md`
3. Build `dist/device_index.json`
4. Export `dist/mobilemodels_mysql_seed.sql`
5. Load the MySQL schema and seed

## Delivery recommendations

- new third-party integrations should prefer `mm_device_catalog`
- page testing and database testing use the same raw data and normalization rules
- always replace the default database passwords in production
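The normalization rule above can be sketched as a small function. This is a sketch only; the real server-side implementation may differ in details such as full-width character handling:

```python
import re

def alias_norm(raw):
    # Lower-case, then keep only ASCII digits/letters and CJK ideographs;
    # spaces, hyphens, underscores and punctuation fall away.
    return "".join(re.findall(r"[0-9a-z\u4e00-\u9fff]", raw.lower()))

# The examples documented above:
print(alias_norm("SM-G9980"))    # smg9980
print(alias_norm("iPhone14,2"))  # iphone142
print(alias_norm("NOH-AL00"))    # nohal00
```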
+165
@@ -0,0 +1,165 @@
# Web UI

## Startup

From the project root:

```bash
docker compose up --build -d
```

To use a local test MySQL:

```bash
docker compose -f docker-compose.yml -f docker-compose.test.yml up --build -d
```

To customize environment variables:

```bash
cp .env.example .env
```

Compose resolves environment variables in this order:

1. environment variables in the current shell
2. `.env` in the project root
3. defaults in `docker-compose.yml`

Stop the services:

```bash
docker compose down
```

Reset MySQL and runtime data:

```bash
docker compose down -v
```

## Page entry points

- `http://127.0.0.1:8123/web/device_query.html`: device query
- `http://127.0.0.1:8123/web/brand_management.html`: data management
- `http://127.0.0.1:8123/web/device_query.html?view=docs`: documentation

The whole stack runs inside Docker Compose; a local Python install is no longer required.

The raw-data workspace lives in the project's `workspace/` directory.

## What happens automatically on startup

- build the device index from `workspace/brands`
- generate `dist/device_index.json`
- export the MySQL seed file
- if `MYSQL_AUTO_LOAD=1`, load the MySQL schema and seed data
- start the project's built-in daily auto-sync scheduler
- start the web pages and the API service

On first startup the defaults still come from environment variables; the auto-load switch can be changed later in the Web UI, and the runtime config is persisted to `/data/state/mysql_settings.json`.

## Default MySQL connection

- Host: `127.0.0.1`
- Port: `3306`
- Database: `mobilemodels`
- Reader user: `mobilemodels_reader`

To customize credentials, override the defaults via `.env`.

Common variables:

- `MYSQL_HOST`
- `MYSQL_PORT`
- `TZ`
- `MYSQL_ROOT_USER`
- `MYSQL_ROOT_PASSWORD`
- `MYSQL_DATABASE`
- `MYSQL_READER_USER`
- `MYSQL_READER_PASSWORD`
- `MYSQL_AUTO_LOAD`
- `SYNC_SCHEDULE_ENABLED`
- `SYNC_SCHEDULE_TIME`
- `GITHUB_PROXY_PREFIX`

## MySQL modes

- Main config `docker-compose.yml`
  - targets a remote MySQL
  - does not auto-load the schema/seed by default
- Test config `docker-compose.test.yml`
  - starts an extra local test MySQL
  - the app container loads the data into it automatically

## Device query

The top of every page offers three navigation entries:

- `设备查询` (device query)
- `数据管理` (data management)
- `相关文档` (documentation)

The device query page has two in-page tabs:

- `SQL 查询` (SQL query)
- `索引查询` (index query)

### SQL query

- calls the in-Compose API to query the MySQL main table `mobilemodels.mm_device_catalog`
- the server first normalizes the input into `alias_norm`
- the page shows the SQL actually executed, the results and the JSON
- the page also shows the read-only connection parameters for verification

### Index query

- fast matching against the in-memory index `dist/device_index.json`
- suited to front-end testing, API comparison and result verification

### Input tips per platform

- Android / iOS / HarmonyOS: use the raw `model_raw` value reported by the client
- the input box offers a sample value for the selected platform
- with no input, the query runs with the current platform's default sample value

## Data management

The data management page supports:

- brand list management
- brand-to-manufacturer relationship management
- brand synonym management
- data-source priority management
- manual initialization of an external MySQL
- raw-data sync
- daily auto-sync time configuration
- viewing and reloading the index data

### Manual initialization of an external MySQL

- page entry: `数据管理 -> 原始数据同步 -> 初始化外部 MySQL`
- intended for a remote MySQL running with `MYSQL_AUTO_LOAD=0`
- clicking it imports the schema and seed, creates the database automatically, and rebuilds the `mobilemodels` tables and views
- before running, confirm that `MYSQL_HOST`, `MYSQL_PORT`, `MYSQL_ROOT_USER` and `MYSQL_ROOT_PASSWORD` point at the right server and carry create-database/create-table privileges

### MySQL auto-load switch

- page entry: `数据管理 -> 原始数据同步 -> MySQL 自动装载`
- saving updates the runtime config `/data/state/mysql_settings.json`
- controls whether a later "start raw-data sync" also refreshes MySQL
- also controls whether later container startups run the schema/seed import

### Daily auto sync

- the scheduler runs inside the project container, independent of GitHub Actions
- page entry: `数据管理 -> 原始数据同步`
- lets you enable/disable it and set the daily run time
- optionally takes a GitHub acceleration prefix, e.g. `https://ghfast.top/`
- runtime config is persisted in `/data/state/sync_schedule.json`
- run times follow the container time zone, taken from `TZ`, default `Asia/Shanghai`

## Notes

- raw data, index and MySQL seed are persisted at runtime in a Docker volume and are not written back to the local workspace
- override the default `MYSQL_ROOT_PASSWORD` and `MYSQL_READER_PASSWORD` in delivered environments
@@ -0,0 +1,257 @@
CREATE DATABASE IF NOT EXISTS `mobilemodels`
    DEFAULT CHARACTER SET utf8mb4
    DEFAULT COLLATE utf8mb4_0900_ai_ci;

CREATE DATABASE IF NOT EXISTS `python_services_test`
    DEFAULT CHARACTER SET utf8mb4
    DEFAULT COLLATE utf8mb4_0900_ai_ci;

-- MySQL has no single "DROP TABLE OR VIEW" statement, so each object is dropped
-- via a prepared statement selected from information_schema ('DO 0' is a no-op).
SET @drop_stmt = (
    SELECT CASE `TABLE_TYPE`
        WHEN 'BASE TABLE' THEN 'DROP TABLE `python_services_test`.`models`'
        WHEN 'VIEW' THEN 'DROP VIEW `python_services_test`.`models`'
        ELSE 'DO 0'
    END
    FROM `information_schema`.`TABLES`
    WHERE `TABLE_SCHEMA` = 'python_services_test' AND `TABLE_NAME` = 'models'
    LIMIT 1
);
SET @drop_stmt = COALESCE(@drop_stmt, 'DO 0');
PREPARE stmt FROM @drop_stmt;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

USE `mobilemodels`;

SET @drop_stmt = (
    SELECT CASE `TABLE_TYPE`
        WHEN 'BASE TABLE' THEN 'DROP TABLE `mm_device_record`'
        WHEN 'VIEW' THEN 'DROP VIEW `mm_device_record`'
        ELSE 'DO 0'
    END
    FROM `information_schema`.`TABLES`
    WHERE `TABLE_SCHEMA` = 'mobilemodels' AND `TABLE_NAME` = 'mm_device_record'
    LIMIT 1
);
SET @drop_stmt = COALESCE(@drop_stmt, 'DO 0');
PREPARE stmt FROM @drop_stmt;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

SET @drop_stmt = (
    SELECT CASE `TABLE_TYPE`
        WHEN 'BASE TABLE' THEN 'DROP TABLE `mm_device_lookup`'
        WHEN 'VIEW' THEN 'DROP VIEW `mm_device_lookup`'
        ELSE 'DO 0'
    END
    FROM `information_schema`.`TABLES`
    WHERE `TABLE_SCHEMA` = 'mobilemodels' AND `TABLE_NAME` = 'mm_device_lookup'
    LIMIT 1
);
SET @drop_stmt = COALESCE(@drop_stmt, 'DO 0');
PREPARE stmt FROM @drop_stmt;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

SET @drop_stmt = (
    SELECT CASE `TABLE_TYPE`
        WHEN 'BASE TABLE' THEN 'DROP TABLE `models`'
        WHEN 'VIEW' THEN 'DROP VIEW `models`'
        ELSE 'DO 0'
    END
    FROM `information_schema`.`TABLES`
    WHERE `TABLE_SCHEMA` = 'mobilemodels' AND `TABLE_NAME` = 'models'
    LIMIT 1
);
SET @drop_stmt = COALESCE(@drop_stmt, 'DO 0');
PREPARE stmt FROM @drop_stmt;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

SET @drop_stmt = (
    SELECT CASE `TABLE_TYPE`
        WHEN 'BASE TABLE' THEN 'DROP TABLE `vw_mm_device_lookup`'
        WHEN 'VIEW' THEN 'DROP VIEW `vw_mm_device_lookup`'
        ELSE 'DO 0'
    END
    FROM `information_schema`.`TABLES`
    WHERE `TABLE_SCHEMA` = 'mobilemodels' AND `TABLE_NAME` = 'vw_mm_device_lookup'
    LIMIT 1
);
SET @drop_stmt = COALESCE(@drop_stmt, 'DO 0');
PREPARE stmt FROM @drop_stmt;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

SET @drop_stmt = (
    SELECT CASE `TABLE_TYPE`
        WHEN 'BASE TABLE' THEN 'DROP TABLE `vw_models`'
        WHEN 'VIEW' THEN 'DROP VIEW `vw_models`'
        ELSE 'DO 0'
    END
    FROM `information_schema`.`TABLES`
    WHERE `TABLE_SCHEMA` = 'mobilemodels' AND `TABLE_NAME` = 'vw_models'
    LIMIT 1
);
SET @drop_stmt = COALESCE(@drop_stmt, 'DO 0');
PREPARE stmt FROM @drop_stmt;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

CREATE TABLE IF NOT EXISTS `mm_device_catalog` (
    `record_id` varchar(64) NOT NULL,
    `model` varchar(191) NOT NULL,
    `alias_norm` varchar(191) NOT NULL,
    `device_name` varchar(255) NOT NULL,
    `brand` varchar(64) NOT NULL,
    `manufacturer_brand` varchar(64) NOT NULL,
    `parent_brand` varchar(64) NOT NULL,
    `market_brand` varchar(64) NOT NULL,
    `device_type` enum('phone','tablet','wear','tv','other') NOT NULL,
    `code` varchar(64) DEFAULT NULL,
    `code_alias` varchar(255) DEFAULT NULL,
    `ver_name` text DEFAULT NULL,
    `source_file` varchar(255) NOT NULL,
    `section` varchar(255) NOT NULL,
    `source_rank` int NOT NULL,
    `source_weight` decimal(6,3) NOT NULL,
    `updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    `hash_md5` char(32) GENERATED ALWAYS AS (
        md5(concat_ws(_utf8mb4'|', `model`, `device_type`, `market_brand`, `manufacturer_brand`, `code`, `code_alias`, `device_name`, `ver_name`))
    ) STORED,
    `hash_crc` int unsigned GENERATED ALWAYS AS (
        crc32(concat_ws(_utf8mb4'|', `model`, `device_type`, `market_brand`, `manufacturer_brand`, `code`, `code_alias`, `device_name`, `ver_name`))
    ) STORED,
    PRIMARY KEY (`record_id`, `model`),
    KEY `idx_mm_device_catalog_alias_norm` (`alias_norm`, `source_rank`, `record_id`),
    KEY `idx_mm_device_catalog_model` (`model`),
    KEY `idx_mm_device_catalog_market_brand` (`market_brand`),
    KEY `idx_mm_device_catalog_parent_brand` (`parent_brand`),
    KEY `idx_mm_device_catalog_manufacturer_brand` (`manufacturer_brand`),
    KEY `idx_mm_device_catalog_device_type` (`device_type`),
    KEY `idx_mm_device_catalog_code` (`code`),
    KEY `idx_mm_device_catalog_hash` (`hash_md5`, `hash_crc`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;

CREATE TABLE IF NOT EXISTS `mm_brand_lookup` (
    `alias_norm` varchar(191) NOT NULL,
    `alias_type` enum('manufacturer','parent','market') NOT NULL,
    `canonical_brand` varchar(64) NOT NULL,
    `manufacturer_brand` varchar(64) DEFAULT NULL,
    `parent_brand` varchar(64) DEFAULT NULL,
    `market_brand` varchar(64) DEFAULT NULL,
    `updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (`alias_norm`, `alias_type`),
    KEY `idx_mm_brand_lookup_canonical_brand` (`canonical_brand`),
    KEY `idx_mm_brand_lookup_manufacturer_brand` (`manufacturer_brand`),
    KEY `idx_mm_brand_lookup_parent_brand` (`parent_brand`),
    KEY `idx_mm_brand_lookup_market_brand` (`market_brand`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;

CREATE OR REPLACE VIEW `mm_device_lookup` AS
SELECT
    c.`alias_norm`, c.`record_id`, c.`device_name`, c.`brand`,
    c.`manufacturer_brand`, c.`parent_brand`, c.`market_brand`, c.`device_type`,
    c.`source_file`, c.`section`, c.`source_rank`, c.`source_weight`, c.`updated_at`
FROM `mm_device_catalog` AS c;

CREATE OR REPLACE VIEW `mm_device_record` AS
SELECT
    c.`record_id`, c.`device_name`, c.`brand`, c.`manufacturer_brand`,
    c.`parent_brand`, c.`market_brand`, c.`device_type`, c.`source_file`,
    c.`section`, c.`source_rank`, c.`source_weight`,
    CAST(CONCAT('[', GROUP_CONCAT(JSON_QUOTE(c.`model`) ORDER BY c.`model` SEPARATOR ','), ']') AS JSON) AS `aliases_json`,
    MAX(c.`updated_at`) AS `updated_at`
FROM `mm_device_catalog` AS c
GROUP BY
    c.`record_id`, c.`device_name`, c.`brand`, c.`manufacturer_brand`,
    c.`parent_brand`, c.`market_brand`, c.`device_type`, c.`source_file`,
    c.`section`, c.`source_rank`, c.`source_weight`;

CREATE OR REPLACE VIEW `vw_mm_device_lookup` AS
SELECT
    c.`alias_norm`, c.`record_id`, c.`device_name`, c.`brand`,
    c.`manufacturer_brand`, c.`parent_brand`, c.`market_brand`, c.`device_type`,
    c.`source_file`, c.`section`, c.`source_rank`, c.`source_weight`, c.`updated_at`
FROM `mm_device_catalog` AS c;

CREATE OR REPLACE VIEW `models` AS
SELECT
    c.`model`, c.`device_type` AS `dtype`, c.`market_brand` AS `brand`,
    c.`manufacturer_brand` AS `brand_title`, c.`code`, c.`code_alias`,
    c.`device_name` AS `model_name`, c.`ver_name`, c.`updated_at` AS `update_at`,
    c.`hash_md5`, c.`hash_crc`
FROM `mm_device_catalog` AS c;

CREATE OR REPLACE VIEW `vw_models` AS
SELECT
    c.`model`, c.`device_type` AS `dtype`, c.`market_brand` AS `brand`,
    c.`manufacturer_brand` AS `brand_title`, c.`code`, c.`code_alias`,
    c.`device_name` AS `model_name`, c.`ver_name`, c.`updated_at` AS `update_at`,
    c.`hash_md5`, c.`hash_crc`
FROM `mm_device_catalog` AS c;

CREATE OR REPLACE VIEW `python_services_test`.`models` AS
SELECT
    `model`, `dtype`, `brand`, `brand_title`, `code`, `code_alias`,
    `model_name`, `ver_name`, `update_at`, `hash_md5`, `hash_crc`
FROM `mobilemodels`.`models`;
@@ -0,0 +1,39 @@
#!/bin/sh
set -eu

cd /app

sh tools/init_runtime_data.sh

python3 tools/device_mapper.py build
python3 tools/export_mysql_seed.py

MYSQL_AUTO_LOAD_EFFECTIVE="$(python3 - <<'PY'
import json
import os
from pathlib import Path

data_root = Path(os.environ.get("MOBILEMODELS_DATA_ROOT", "/data"))
config_path = data_root / "state/mysql_settings.json"
raw_default = os.environ.get("MYSQL_AUTO_LOAD", "0").strip().lower()
value = raw_default in {"1", "true", "yes", "on"}

try:
    if config_path.exists():
        payload = json.loads(config_path.read_text(encoding="utf-8"))
        raw = payload.get("auto_load", value) if isinstance(payload, dict) else value
        value = raw if isinstance(raw, bool) else str(raw).strip().lower() in {"1", "true", "yes", "on"}
except Exception:
    pass

print("1" if value else "0")
PY
)"

if [ "$MYSQL_AUTO_LOAD_EFFECTIVE" = "1" ]; then
    python3 tools/load_mysql_seed.py
else
    echo "Skipping MySQL load because MYSQL_AUTO_LOAD=$MYSQL_AUTO_LOAD_EFFECTIVE"
fi

exec python3 tools/web_server.py --host 0.0.0.0 --port 8123
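The truthy-string handling in the embedded Python above can be sketched in isolation (the helper name is hypothetical; the entrypoint inlines this logic rather than defining a function):

```python
def parse_auto_load(raw: str) -> bool:
    # Mirrors the entrypoint: only "1"/"true"/"yes"/"on" enable the load,
    # case-insensitively and ignoring surrounding whitespace.
    return raw.strip().lower() in {"1", "true", "yes", "on"}

print(parse_auto_load(" YES "))  # True
print(parse_auto_load("0"))      # False
```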
@@ -0,0 +1,759 @@
#!/usr/bin/env python3
"""Build and query a cross-platform device mapping index from MobileModels markdown data."""

from __future__ import annotations

import argparse
import json
import re
from collections import Counter
from dataclasses import asdict, dataclass
from datetime import date
from pathlib import Path
from typing import Dict, Iterable, List, Optional, Set

from project_layout import PROJECT_ROOT, WORKSPACE_ROOT

# Markdown parsing patterns: a device entry is a bold line, a variant is one or
# more backticked model codes followed by a colon and a variant name.
ENTRY_RE = re.compile(r"^\*\*(.+?)\*\*\s*$")
VARIANT_RE = re.compile(r"^\s*((?:`[^`]+`\s*)+):\s*(.+?)\s*$")
BACKTICK_RE = re.compile(r"`([^`]+)`")
SECTION_RE = re.compile(r"^##\s+(.+?)\s*$")

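To illustrate the line formats these patterns expect (the sample markdown lines below are hypothetical, not taken from the repository data):

```python
import re

ENTRY_RE = re.compile(r"^\*\*(.+?)\*\*\s*$")
VARIANT_RE = re.compile(r"^\s*((?:`[^`]+`\s*)+):\s*(.+?)\s*$")
BACKTICK_RE = re.compile(r"`([^`]+)`")

# A bold line opens a device entry; its title may embed backticked codes.
entry = ENTRY_RE.match("**Example Phone 1 (`examplename`)**")
print(entry.group(1))  # Example Phone 1 (`examplename`)

# A "`CODE`: name" line adds a variant to the current entry.
variant = VARIANT_RE.match("  `EX1234`: Example Phone 1 Global")
print(variant.group(2))  # Example Phone 1 Global

# Every backticked fragment becomes an alias candidate.
print(BACKTICK_RE.findall(entry.group(1)))  # ['examplename']
```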
FILE_BRAND_MAP: Dict[str, str] = {
    "360shouji": "360",
    "apple_all": "Apple",
    "apple_all_en": "Apple",
    "apple_cn": "Apple",
    "asus_cn": "ASUS",
    "asus_en": "ASUS",
    "blackshark": "Black Shark",
    "blackshark_en": "Black Shark",
    "coolpad": "Coolpad",
    "google": "Google",
    "honor_cn": "HONOR",
    "honor_global_en": "HONOR",
    "huawei_cn": "HUAWEI",
    "huawei_global_en": "HUAWEI",
    "lenovo_cn": "Lenovo",
    "letv": "LeTV",
    "meizu": "Meizu",
    "meizu_en": "Meizu",
    "mitv_cn": "Xiaomi",
    "mitv_global_en": "Xiaomi",
    "motorola_cn": "Motorola",
    "nokia_cn": "Nokia",
    "nothing": "Nothing",
    "nubia": "nubia",
    "oneplus": "OnePlus",
    "oneplus_en": "OnePlus",
    "oppo_cn": "OPPO",
    "oppo_global_en": "OPPO",
    "realme_cn": "realme",
    "realme_global_en": "realme",
    "samsung_cn": "Samsung",
    "samsung_global_en": "Samsung",
    "smartisan": "Smartisan",
    "sony": "Sony",
    "sony_cn": "Sony",
    "vivo_cn": "vivo",
    "vivo_global_en": "vivo",
    "xiaomi": "Xiaomi",
    "xiaomi_cn": "Xiaomi",
    "xiaomi_en": "Xiaomi",
    "xiaomi-wear": "Xiaomi",
    "zhixuan": "HUAWEI Smart Selection",
    "zte_cn": "ZTE",
}


FILE_DEFAULT_DEVICE_TYPE: Dict[str, str] = {
    "mitv_cn": "tv",
    "mitv_global_en": "tv",
    "xiaomi-wear": "wear",
    "apple_all": "phone",
    "apple_all_en": "phone",
    "apple_cn": "phone",
    "google": "phone",
    "honor_cn": "phone",
    "honor_global_en": "phone",
    "huawei_cn": "phone",
    "huawei_global_en": "phone",
    "xiaomi": "phone",
    "xiaomi_cn": "phone",
    "xiaomi_en": "phone",
    "zhixuan": "phone",
}


BRAND_ALIASES: Dict[str, List[str]] = {
    "360": ["360", "360手机", "奇酷", "qiku"],
    "Apple": ["apple", "苹果", "iphone", "ipad", "ipod"],
    "ASUS": ["asus", "华硕", "rog", "zenfone"],
    "Black Shark": ["black shark", "blackshark", "黑鲨"],
    "Coolpad": ["coolpad", "酷派"],
    "Google": ["google", "pixel"],
    "HONOR": ["honor", "荣耀"],
    "HUAWEI": ["huawei", "华为"],
    "HUAWEI Smart Selection": ["华为智选", "zhixuan", "umagic", "wiko", "hi nova", "nzone"],
    "Lenovo": ["lenovo", "联想", "zuk", "拯救者"],
    "LeTV": ["letv", "乐视"],
    "Meizu": ["meizu", "魅族"],
    "Motorola": ["motorola", "摩托罗拉", "moto"],
    "Nokia": ["nokia", "诺基亚"],
    "Nothing": ["nothing", "cmf"],
    "nubia": ["nubia", "努比亚", "红魔", "redmagic"],
    "iQOO": ["iqoo", "i qoo", "艾酷"],
    "OnePlus": ["oneplus", "一加"],
    "OPPO": ["oppo"],
    "POCO": ["poco"],
    "Redmi": ["redmi", "红米", "hongmi"],
    "realme": ["realme", "真我"],
    "Samsung": ["samsung", "三星", "galaxy"],
    "Smartisan": ["smartisan", "锤子", "坚果"],
    "Sony": ["sony", "索尼", "xperia"],
    "vivo": ["vivo"],
    "Xiaomi": ["xiaomi", "小米", "mi", "米家", "mipad"],
    "ZTE": ["zte", "中兴"],
}

MANUFACTURER_PARENT_BRAND: Dict[str, str] = {
    "Black Shark": "Xiaomi",
    "HUAWEI Smart Selection": "HUAWEI",
    "Motorola": "Lenovo",
    "iQOO": "vivo",
    "POCO": "Xiaomi",
    "Redmi": "Xiaomi",
    "OnePlus": "OPPO",
    "realme": "OPPO",
    "nubia": "ZTE",
}

MARKET_BRAND_ALIASES: Dict[str, List[str]] = {
    "iQOO": ["iqoo", "i qoo", "艾酷"],
    "POCO": ["poco"],
    "Redmi": ["redmi", "红米", "hongmi"],
    "Xiaomi": ["xiaomi", "小米", "mi", "mipad", "米家"],
}

MARKET_BRAND_TO_MANUFACTURER: Dict[str, str] = {
    "iQOO": "vivo",
    "POCO": "Xiaomi",
    "Redmi": "Xiaomi",
    "Xiaomi": "Xiaomi",
}

TV_KEYWORDS = ["tv", "电视", "智慧屏", "smart tv", "机顶盒", "tv box", "stick", "dongle"]
TABLET_KEYWORDS = ["ipad", "tablet", "tab", "pad", "平板", "matepad"]
WEAR_KEYWORDS = [
    "watch", "smartwatch", "手表", "手环", "band", "wear", "wearable",
    "buds", "earbuds", "耳机", "tws", "eyewear", "glasses", "眼镜",
]
OTHER_KEYWORDS = ["matebook", "笔记本", "laptop", "notebook", "vision", "vr", "ipod", "airpods"]
PHONE_KEYWORDS = [
    "iphone", "phone", "手机", "galaxy", "pixel", "xiaomi", "redmi", "poco",
    "honor", "huawei", "mate", "nova", "oppo", "vivo", "realme", "oneplus",
    "nokia", "nubia", "meizu", "lenovo", "motorola", "zte", "smartisan",
    "zenfone", "rog", "麦芒", "畅享", "优畅享",
]


@dataclass
class DeviceRecord:
    id: str
    device_name: str
    brand: str
    manufacturer_brand: str
    parent_brand: str
    market_brand: str
    device_type: str
    aliases: List[str]
    source_file: str
    section: str


def normalize_text(text: str) -> str:
    return re.sub(r"[^0-9a-z\u4e00-\u9fff]+", "", text.lower())
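For example, `normalize_text` lowercases the input and strips everything except digits, ASCII letters, and CJK characters, so differently formatted aliases collapse to the same lookup key (re-stating the one-line function above):

```python
import re

def normalize_text(text: str) -> str:
    # Keep only digits, lowercase ASCII letters, and CJK ideographs.
    return re.sub(r"[^0-9a-z\u4e00-\u9fff]+", "", text.lower())

print(normalize_text("Redmi K70 Pro"))   # redmik70pro
print(normalize_text("REDMI-K70-PRO"))   # redmik70pro
print(normalize_text("红米 K70"))         # 红米k70
```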


def canonical_brand(file_stem: str) -> str:
    return FILE_BRAND_MAP.get(file_stem, file_stem)


def brand_aliases(brand: str) -> List[str]:
    aliases = set(BRAND_ALIASES.get(brand, []))
    aliases.add(brand)
    return sorted(aliases)


def has_keyword(text: str, keywords: Iterable[str]) -> bool:
    norm_text = normalize_text(text)
    for kw in keywords:
        if normalize_text(kw) and normalize_text(kw) in norm_text:
            return True
    return False


def resolve_parent_brand(manufacturer_brand: str) -> str:
    return MANUFACTURER_PARENT_BRAND.get(manufacturer_brand, manufacturer_brand)


def infer_market_brand(
    manufacturer_brand: str,
    device_name: str,
    section: str,
    aliases: Iterable[str],
) -> str:
    corpus = normalize_text(" ".join([device_name, section, *aliases]))

    if manufacturer_brand == "Xiaomi":
        poco_keys = [normalize_text(v) for v in MARKET_BRAND_ALIASES["POCO"]]
        redmi_keys = [normalize_text(v) for v in MARKET_BRAND_ALIASES["Redmi"]]
        if any(key and key in corpus for key in poco_keys):
            return "POCO"
        if any(key and key in corpus for key in redmi_keys):
            return "Redmi"
        return "Xiaomi"

    if manufacturer_brand == "vivo":
        iqoo_keys = [normalize_text(v) for v in MARKET_BRAND_ALIASES["iQOO"]]
        if any(key and key in corpus for key in iqoo_keys):
            return "iQOO"
        return "vivo"

    return manufacturer_brand


def infer_device_type(
    device_name: str,
    section: str,
    source_file: str,
    aliases: Iterable[str],
    default_type: str,
) -> str:
    corpus = " ".join([device_name, section, *aliases, source_file])

    if has_keyword(corpus, TV_KEYWORDS):
        return "tv"
    if has_keyword(corpus, TABLET_KEYWORDS):
        return "tablet"
    if has_keyword(corpus, WEAR_KEYWORDS):
        return "wear"
    if has_keyword(corpus, OTHER_KEYWORDS):
        return "other"
    if has_keyword(corpus, PHONE_KEYWORDS):
        return "phone"
    return default_type or "other"


def clean_entry_title(raw_title: str) -> str:
    title = raw_title.strip()
    if title.endswith(":"):
        title = title[:-1].strip()

    # remove leading tag like: [`X1`] or [X1]
    title = re.sub(r"^\[[^\]]+\]\s*", "", title)

    # remove one or more trailing codenames like: (`foo`) (`bar`)
    title = re.sub(r"(?:\s*\(\s*`[^`]+`\s*\))+\s*$", "", title)
    title = re.sub(r"\s*\((?:codename|代号)[^)]*\)\s*$", "", title, flags=re.IGNORECASE)

    # strip markdown links while keeping text: [Foo](url) -> Foo
    title = re.sub(r"\[([^\]]+)\]\([^)]*\)", r"\1", title)

    title = " ".join(title.split())
    return title
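Re-stating the function above, its cleanup steps can be seen on a hypothetical raw title (the sample strings are illustrative, not repository data):

```python
import re

def clean_entry_title(raw_title: str) -> str:
    title = raw_title.strip()
    if title.endswith(":"):
        title = title[:-1].strip()
    # remove leading tag like: [`X1`] or [X1]
    title = re.sub(r"^\[[^\]]+\]\s*", "", title)
    # remove one or more trailing codenames like: (`foo`) (`bar`)
    title = re.sub(r"(?:\s*\(\s*`[^`]+`\s*\))+\s*$", "", title)
    title = re.sub(r"\s*\((?:codename|代号)[^)]*\)\s*$", "", title, flags=re.IGNORECASE)
    # strip markdown links while keeping text: [Foo](url) -> Foo
    title = re.sub(r"\[([^\]]+)\]\([^)]*\)", r"\1", title)
    return " ".join(title.split())

# Leading tag, two trailing codenames, and the trailing colon are all removed.
print(clean_entry_title("[`5G`] Example Phone 2 (`ex2`) (`ex2pro`):"))  # Example Phone 2

# An embedded markdown link keeps its text.
print(clean_entry_title("Example [Phone 2](https://example.com) Pro"))  # Example Phone 2 Pro
```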


def extract_codes(text: str) -> List[str]:
    return [code.strip() for code in BACKTICK_RE.findall(text) if code.strip()]


def parse_brand_file(path: Path) -> List[DeviceRecord]:
    file_stem = path.stem
    brand = canonical_brand(file_stem)
    default_type = FILE_DEFAULT_DEVICE_TYPE.get(file_stem, "phone")

    records: List[DeviceRecord] = []
    lines = path.read_text(encoding="utf-8").splitlines()

    section = ""
    current_title = ""
    current_aliases: Set[str] = set()

    def flush_current() -> None:
        nonlocal current_title, current_aliases
        if not current_title:
            return

        aliases = sorted({alias.strip() for alias in current_aliases if alias.strip()})
        record_id = f"{file_stem}:{len(records) + 1}"
        device_type = infer_device_type(
            device_name=current_title,
            section=section,
            source_file=path.name,
            aliases=aliases,
            default_type=default_type,
        )
        records.append(
            DeviceRecord(
                id=record_id,
                device_name=current_title,
                brand=brand,
                manufacturer_brand=brand,
                parent_brand=resolve_parent_brand(brand),
                market_brand=infer_market_brand(
                    manufacturer_brand=brand,
                    device_name=current_title,
                    section=section,
                    aliases=aliases,
                ),
                device_type=device_type,
                aliases=aliases,
                source_file=f"brands/{path.name}",
                section=section,
            )
        )
        current_title = ""
        current_aliases = set()

    for raw in lines:
        line = raw.strip()
        if not line:
            continue

        section_match = SECTION_RE.match(line)
        if section_match:
            section = section_match.group(1).strip()
            continue

        entry_match = ENTRY_RE.match(line)
        if entry_match:
            flush_current()
            raw_title = entry_match.group(1).strip()
            current_title = clean_entry_title(raw_title)
            current_aliases = set(extract_codes(raw_title))
            current_aliases.add(current_title)
            continue

        if not current_title:
            continue

        variant_match = VARIANT_RE.match(line)
        if variant_match:
            variant_codes = extract_codes(variant_match.group(1))
            variant_name = variant_match.group(2).strip()
            current_aliases.update(variant_codes)
            current_aliases.add(variant_name)

    flush_current()
    return records


class DeviceMapper:
    def __init__(self, records: List[DeviceRecord]) -> None:
        self.records = records
        self.records_by_id = {record.id: record for record in records}
        self.manufacturer_alias_lookup: Dict[str, str] = {}
        self.parent_alias_lookup: Dict[str, str] = {}
        self.market_alias_lookup: Dict[str, str] = {}
        self.parent_to_children: Dict[str, Set[str]] = {}

        self.alias_index: Dict[str, Set[str]] = {}
        for record in records:
            for alias in record.aliases:
                key = normalize_text(alias)
                if not key:
                    continue
                self.alias_index.setdefault(key, set()).add(record.id)

        manufacturers = sorted({record.manufacturer_brand for record in records})
        parents = sorted({record.parent_brand for record in records})
        for brand in manufacturers:
            for alias in brand_aliases(brand):
                key = normalize_text(alias)
                if key:
                    self.manufacturer_alias_lookup[key] = brand

        for parent in parents:
            for alias in brand_aliases(parent):
                key = normalize_text(alias)
                if key:
                    self.parent_alias_lookup[key] = parent

        for manufacturer in manufacturers:
            parent = resolve_parent_brand(manufacturer)
            self.parent_to_children.setdefault(parent, set()).add(manufacturer)

        for market_brand, aliases in MARKET_BRAND_ALIASES.items():
            for alias in set([market_brand, *aliases]):
                key = normalize_text(alias)
                if key:
                    self.market_alias_lookup[key] = market_brand

    def _parse_brand_filter(self, input_brand: Optional[str]) -> Dict[str, Optional[str]]:
        input_norm = normalize_text(input_brand) if input_brand else ""
        if not input_norm:
            return {
                "parent_brand": None,
                "manufacturer_brand": None,
                "market_brand": None,
                "source": "none",
            }

        if input_norm in self.market_alias_lookup:
            market_brand = self.market_alias_lookup[input_norm]
            manufacturer_brand = MARKET_BRAND_TO_MANUFACTURER.get(market_brand, market_brand)
            parent_brand = resolve_parent_brand(manufacturer_brand)
            if market_brand == "Xiaomi":
                return {
                    "parent_brand": parent_brand,
                    "manufacturer_brand": manufacturer_brand,
                    "market_brand": None,
                    "source": "manufacturer_alias_from_market",
                }
            return {
                "parent_brand": parent_brand,
                "manufacturer_brand": manufacturer_brand,
                "market_brand": market_brand,
                "source": "market_alias_exact",
            }

        if input_norm in self.manufacturer_alias_lookup:
            manufacturer_brand = self.manufacturer_alias_lookup[input_norm]
            parent_brand = resolve_parent_brand(manufacturer_brand)
            children = self.parent_to_children.get(manufacturer_brand, set())
            if manufacturer_brand == parent_brand and len(children) > 1:
                return {
                    "parent_brand": parent_brand,
                    "manufacturer_brand": None,
                    "market_brand": None,
                    "source": "parent_alias_exact",
                }
            return {
                "parent_brand": parent_brand,
                "manufacturer_brand": manufacturer_brand,
                "market_brand": None,
                "source": "manufacturer_alias_exact",
            }

        if input_norm in self.parent_alias_lookup:
            parent_brand = self.parent_alias_lookup[input_norm]
            return {
                "parent_brand": parent_brand,
                "manufacturer_brand": None,
                "market_brand": None,
                "source": "parent_alias_exact",
            }

        for alias_norm, market_brand in self.market_alias_lookup.items():
            if alias_norm and alias_norm in input_norm:
                manufacturer_brand = MARKET_BRAND_TO_MANUFACTURER.get(market_brand, market_brand)
                return {
                    "parent_brand": resolve_parent_brand(manufacturer_brand),
                    "manufacturer_brand": manufacturer_brand,
                    "market_brand": market_brand,
                    "source": "market_alias_contains",
                }

        for alias_norm, manufacturer_brand in self.manufacturer_alias_lookup.items():
            if alias_norm and alias_norm in input_norm:
                parent_brand = resolve_parent_brand(manufacturer_brand)
                children = self.parent_to_children.get(manufacturer_brand, set())
                if manufacturer_brand == parent_brand and len(children) > 1:
                    return {
                        "parent_brand": parent_brand,
                        "manufacturer_brand": None,
                        "market_brand": None,
                        "source": "parent_alias_contains",
                    }
                return {
                    "parent_brand": parent_brand,
                    "manufacturer_brand": manufacturer_brand,
                    "market_brand": None,
                    "source": "manufacturer_alias_contains",
                }

        for alias_norm, parent_brand in self.parent_alias_lookup.items():
            if alias_norm and alias_norm in input_norm:
                return {
                    "parent_brand": parent_brand,
                    "manufacturer_brand": None,
                    "market_brand": None,
                    "source": "parent_alias_contains",
                }

        return {
            "parent_brand": None,
            "manufacturer_brand": None,
            "market_brand": None,
            "source": "none",
        }

    @staticmethod
    def _brand_match(
        brand_filter: Dict[str, Optional[str]],
        record: DeviceRecord,
    ) -> bool:
        parent = brand_filter.get("parent_brand")
        manufacturer = brand_filter.get("manufacturer_brand")
        market = brand_filter.get("market_brand")

        if parent and record.parent_brand != parent:
            return False
        if manufacturer and record.manufacturer_brand != manufacturer:
            return False
        if market and record.market_brand != market:
            return False
        return True

    def find(self, name: str, brand: Optional[str] = None, limit: int = 5) -> Dict[str, object]:
        query = normalize_text(name)
        if not query:
            return {
                "matched": False,
                "reason": "Empty device name.",
                "query_name": name,
                "query_brand": brand,
                "candidates": [],
            }

        candidate_ids = list(self.alias_index.get(query, set()))
        matched_records = [self.records_by_id[rid] for rid in candidate_ids]
        brand_filter = self._parse_brand_filter(brand)

        if brand:
            matched_records = [r for r in matched_records if self._brand_match(brand_filter, r)]
            if not matched_records and brand_filter.get("manufacturer_brand"):
                fallback_filter = {
                    "parent_brand": brand_filter.get("parent_brand"),
                    "manufacturer_brand": brand_filter.get("manufacturer_brand"),
                    "market_brand": None,
                }
                matched_records = [
                    r
                    for r in [self.records_by_id[rid] for rid in candidate_ids]
                    if self._brand_match(fallback_filter, r)
                ]

        matched_records.sort(key=lambda r: (r.device_name, r.source_file, r.id))

        if matched_records:
            best = matched_records[0]
            return {
                "matched": True,
                "query_name": name,
                "query_brand": brand,
                "query_brand_parsed": brand_filter,
                "best": asdict(best),
                "candidates": [asdict(r) for r in matched_records[:limit]],
            }

        suggestions: List[str] = []
        for alias in self.alias_index:
            if query in alias or alias in query:
                suggestions.append(alias)
            if len(suggestions) >= limit:
                break

        return {
            "matched": False,
            "query_name": name,
            "query_brand": brand,
            "query_brand_parsed": brand_filter,
            "reason": "No exact alias match.",
            "candidates": [],
            "suggestions": suggestions,
        }


def build_records(repo_root: Path) -> List[DeviceRecord]:
    brands_dir = repo_root / "brands"
    records: List[DeviceRecord] = []

    for md_path in sorted(brands_dir.glob("*.md")):
        records.extend(parse_brand_file(md_path))

    return records


def export_index(records: List[DeviceRecord], output_path: Path) -> None:
    lookup: Dict[str, List[str]] = {}
    manufacturer_brands_in_data = sorted({record.manufacturer_brand for record in records})
    parent_brands_in_data = sorted({record.parent_brand for record in records})
    market_brands_in_data = sorted({record.market_brand for record in records})
    all_brands_in_data = sorted(
        set(manufacturer_brands_in_data)
        | set(market_brands_in_data)
        | set(MARKET_BRAND_TO_MANUFACTURER.keys())
    )
    manufacturer_stats = dict(sorted(Counter(record.manufacturer_brand for record in records).items()))
    parent_stats = dict(sorted(Counter(record.parent_brand for record in records).items()))
    market_brand_stats = dict(sorted(Counter(record.market_brand for record in records).items()))

    brand_to_manufacturer = {}
    for brand in all_brands_in_data:
        if brand in MARKET_BRAND_TO_MANUFACTURER:
            brand_to_manufacturer[brand] = MARKET_BRAND_TO_MANUFACTURER[brand]
        else:
            brand_to_manufacturer[brand] = resolve_parent_brand(brand)

    parent_to_children: Dict[str, List[str]] = {}
|
||||||
|
for child, parent in brand_to_manufacturer.items():
|
||||||
|
parent_to_children.setdefault(parent, []).append(child)
|
||||||
|
for parent in parent_to_children:
|
||||||
|
parent_to_children[parent] = sorted(parent_to_children[parent])
|
||||||
|
|
||||||
|
all_aliases = {brand: brand_aliases(brand) for brand in all_brands_in_data}
|
||||||
|
|
||||||
|
for record in records:
|
||||||
|
for alias in record.aliases:
|
||||||
|
key = normalize_text(alias)
|
||||||
|
if not key:
|
||||||
|
continue
|
||||||
|
lookup.setdefault(key, []).append(record.id)
|
||||||
|
|
||||||
|
for key, ids in lookup.items():
|
||||||
|
lookup[key] = sorted(set(ids))
|
||||||
|
|
||||||
|
output = {
|
||||||
|
"generated_on": date.today().isoformat(),
|
||||||
|
"total_records": len(records),
|
||||||
|
"brands": manufacturer_brands_in_data,
|
||||||
|
"brand_aliases": all_aliases,
|
||||||
|
"brand_management": {
|
||||||
|
"brands": all_brands_in_data,
|
||||||
|
"manufacturers": sorted(set(brand_to_manufacturer.values())),
|
||||||
|
"manufacturer_aliases": all_aliases,
|
||||||
|
"manufacturer_to_parent": brand_to_manufacturer,
|
||||||
|
"brand_to_manufacturer": brand_to_manufacturer,
|
||||||
|
"parent_to_children": parent_to_children,
|
||||||
|
"parent_aliases": {brand: brand_aliases(brand) for brand in parent_brands_in_data},
|
||||||
|
"market_brand_aliases": MARKET_BRAND_ALIASES,
|
||||||
|
"market_brand_to_manufacturer": MARKET_BRAND_TO_MANUFACTURER,
|
||||||
|
"market_brands": market_brands_in_data,
|
||||||
|
"parent_brands": parent_brands_in_data,
|
||||||
|
"stats": {
|
||||||
|
"manufacturer_brand": manufacturer_stats,
|
||||||
|
"parent_brand": parent_stats,
|
||||||
|
"market_brand": market_brand_stats,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
"lookup": lookup,
|
||||||
|
"records": [asdict(r) for r in records],
|
||||||
|
}
|
||||||
|
output_path.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
output_path.write_text(json.dumps(output, ensure_ascii=False, indent=2), encoding="utf-8")
|
||||||
|
|
||||||
|
|
||||||
|
def main() -> None:
|
||||||
|
parser = argparse.ArgumentParser(description="MobileModels device mapper")
|
||||||
|
parser.add_argument(
|
||||||
|
"--repo-root",
|
||||||
|
type=Path,
|
||||||
|
default=WORKSPACE_ROOT,
|
||||||
|
help="Path to workspace root",
|
||||||
|
)
|
||||||
|
|
||||||
|
subparsers = parser.add_subparsers(dest="command", required=True)
|
||||||
|
|
||||||
|
build_cmd = subparsers.add_parser("build", help="Build JSON index")
|
||||||
|
build_cmd.add_argument(
|
||||||
|
"--output",
|
||||||
|
type=Path,
|
||||||
|
default=Path("dist/device_index.json"),
|
||||||
|
help="Output JSON path",
|
||||||
|
)
|
||||||
|
|
||||||
|
find_cmd = subparsers.add_parser("find", help="Find a device by name + optional brand")
|
||||||
|
find_cmd.add_argument("--name", required=True, help="Raw device name from app")
|
||||||
|
find_cmd.add_argument("--brand", default=None, help="Optional raw brand from app")
|
||||||
|
find_cmd.add_argument("--limit", type=int, default=5, help="Max matched candidates")
|
||||||
|
|
||||||
|
args = parser.parse_args()
|
||||||
|
|
||||||
|
records = build_records(args.repo_root)
|
||||||
|
mapper = DeviceMapper(records)
|
||||||
|
|
||||||
|
if args.command == "build":
|
||||||
|
output_path: Path = args.output
|
||||||
|
if not output_path.is_absolute():
|
||||||
|
output_path = PROJECT_ROOT / output_path
|
||||||
|
export_index(records, output_path)
|
||||||
|
print(f"Built index: {output_path}")
|
||||||
|
print(f"Total records: {len(records)}")
|
||||||
|
return
|
||||||
|
|
||||||
|
if args.command == "find":
|
||||||
|
result = mapper.find(name=args.name, brand=args.brand, limit=args.limit)
|
||||||
|
print(json.dumps(result, ensure_ascii=False, indent=2))
|
||||||
|
return
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
main()
|
||||||
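The `export_index` function above maps each normalized alias to a sorted, de-duplicated list of record ids. A minimal self-contained sketch of that pattern (the `normalize_text` here mirrors the `NORMALIZE_RE` used elsewhere in this change set; the record ids and aliases are made up for illustration):

```python
import re

# Assumption: same normalization as the project's NORMALIZE_RE
# (lowercase, keep only ASCII alphanumerics and CJK characters).
def normalize_text(text: str) -> str:
    return re.sub(r"[^0-9a-z\u4e00-\u9fff]+", "", text.lower())

# Hypothetical records: (record_id, aliases)
records = [
    ("samsung_en:1", ["SM-G998B", "Galaxy S21 Ultra"]),
    ("samsung_cn:7", ["sm-g998b"]),
]

lookup: dict[str, list[str]] = {}
for record_id, aliases in records:
    for alias in aliases:
        key = normalize_text(alias)
        if key:
            lookup.setdefault(key, []).append(record_id)

# De-duplicate and sort each id list, as export_index does.
for key, ids in lookup.items():
    lookup[key] = sorted(set(ids))

print(lookup["smg998b"])
```

Because the key is normalized, `SM-G998B` and `sm-g998b` collapse to one entry pointing at both records, which is what lets `find` match raw names from apps regardless of punctuation and case.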
@@ -0,0 +1,281 @@
#!/usr/bin/env python3
"""Export MobileModels records into MySQL-friendly seed SQL."""

from __future__ import annotations

import argparse
import re
from pathlib import Path
from typing import Iterable

from device_mapper import (
    MARKET_BRAND_ALIASES,
    MARKET_BRAND_TO_MANUFACTURER,
    build_records,
    brand_aliases,
    normalize_text,
    resolve_parent_brand,
)
from project_layout import PROJECT_ROOT, WORKSPACE_ROOT


LEGACY_CODE_RE = re.compile(r"^[A-Za-z0-9][A-Za-z0-9,._/+\\-]{1,63}$")


def is_cn_source_file(source_file: str) -> bool:
    return source_file.endswith("_cn.md")


def build_source_order(records: list[object]) -> list[str]:
    source_files = sorted({record.source_file for record in records})
    cn = [source for source in source_files if is_cn_source_file(source)]
    other = [source for source in source_files if not is_cn_source_file(source)]
    return sorted(cn) + sorted(other)


def build_source_weights(records: list[object]) -> tuple[dict[str, int], dict[str, float]]:
    order = build_source_order(records)
    total = len(order)
    rank_map: dict[str, int] = {}
    weight_map: dict[str, float] = {}

    for idx, source_file in enumerate(order):
        rank = idx + 1
        weight = (((total - idx) / total) * 6) if total > 1 else 6
        rank_map[source_file] = rank
        weight_map[source_file] = round(weight, 3)

    return rank_map, weight_map


def sql_quote(value: object | None) -> str:
    if value is None:
        return "NULL"
    if isinstance(value, bool):
        return "1" if value else "0"
    if isinstance(value, (int, float)):
        return str(value)
    text = str(value)
    text = text.replace("\\", "\\\\").replace("'", "\\'")
    return f"'{text}'"


def batched(items: list[tuple[str, ...]], batch_size: int) -> Iterable[list[tuple[str, ...]]]:
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]


def build_catalog_rows(records: list[object]) -> list[tuple[str, ...]]:
    rank_map, weight_map = build_source_weights(records)
    rows = []
    seen_keys: set[tuple[str, str]] = set()
    for record in records:
        aliases = sorted({alias.strip() for alias in record.aliases if alias.strip()})
        code_aliases = [alias for alias in aliases if is_legacy_code_alias(alias)]
        primary_code = code_aliases[0] if code_aliases else None
        other_codes = [alias for alias in code_aliases if alias != primary_code]
        code_alias = " | ".join(other_codes) if other_codes else None
        version_names = [alias for alias in aliases if not is_legacy_code_alias(alias)]
        ver_name = " | ".join(version_names) if version_names else None

        for alias in aliases:
            alias_norm = normalize_text(alias)
            if not alias_norm:
                continue
            dedupe_key = (record.id, alias_norm)
            if dedupe_key in seen_keys:
                continue
            seen_keys.add(dedupe_key)
            rows.append((
                sql_quote(record.id),
                sql_quote(alias),
                sql_quote(alias_norm),
                sql_quote(record.device_name),
                sql_quote(record.brand),
                sql_quote(record.manufacturer_brand),
                sql_quote(record.parent_brand),
                sql_quote(record.market_brand),
                sql_quote(record.device_type),
                sql_quote(primary_code),
                sql_quote(code_alias),
                sql_quote(ver_name),
                sql_quote(record.source_file),
                sql_quote(record.section),
                sql_quote(rank_map[record.source_file]),
                sql_quote(f"{weight_map[record.source_file]:.3f}"),
            ))

    rows.sort(key=lambda item: (item[2], item[14], item[0], item[1]))
    return rows


def build_brand_rows(records: list[object]) -> list[tuple[str, ...]]:
    manufacturer_brands = sorted({record.manufacturer_brand for record in records})
    parent_brands = sorted({record.parent_brand for record in records})
    rows: dict[tuple[str, str], tuple[str, ...]] = {}

    for brand in manufacturer_brands:
        parent_brand = resolve_parent_brand(brand)
        for alias in brand_aliases(brand):
            alias_norm = normalize_text(alias)
            if not alias_norm:
                continue
            rows[(alias_norm, "manufacturer")] = (
                sql_quote(alias_norm),
                sql_quote("manufacturer"),
                sql_quote(brand),
                sql_quote(brand),
                sql_quote(parent_brand),
                sql_quote(None),
            )

    for brand in parent_brands:
        for alias in brand_aliases(brand):
            alias_norm = normalize_text(alias)
            if not alias_norm:
                continue
            rows[(alias_norm, "parent")] = (
                sql_quote(alias_norm),
                sql_quote("parent"),
                sql_quote(brand),
                sql_quote(None),
                sql_quote(brand),
                sql_quote(None),
            )

    for market_brand, aliases in MARKET_BRAND_ALIASES.items():
        manufacturer_brand = MARKET_BRAND_TO_MANUFACTURER.get(market_brand, market_brand)
        parent_brand = resolve_parent_brand(manufacturer_brand)
        for alias in sorted(set([market_brand, *aliases])):
            alias_norm = normalize_text(alias)
            if not alias_norm:
                continue
            rows[(alias_norm, "market")] = (
                sql_quote(alias_norm),
                sql_quote("market"),
                sql_quote(market_brand),
                sql_quote(manufacturer_brand),
                sql_quote(parent_brand),
                sql_quote(market_brand),
            )

    return [rows[key] for key in sorted(rows)]


def is_legacy_code_alias(text: str) -> bool:
    value = (text or "").strip()
    if not value or not LEGACY_CODE_RE.match(value):
        return False
    return any(ch.isdigit() for ch in value)


def append_insert_block(lines: list[str], table_name: str, columns: list[str], rows: list[tuple[str, ...]], batch_size: int = 500) -> None:
    if not rows:
        return

    column_sql = ", ".join(f"`{column}`" for column in columns)
    for chunk in batched(rows, batch_size):
        values_sql = ",\n".join(f" ({', '.join(row)})" for row in chunk)
        lines.append(f"INSERT INTO `{table_name}` ({column_sql}) VALUES\n{values_sql};")
        lines.append("")


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Export MobileModels MySQL seed SQL.")
    parser.add_argument(
        "--repo-root",
        type=Path,
        default=WORKSPACE_ROOT,
        help="Path to workspace root",
    )
    parser.add_argument(
        "--output",
        type=Path,
        default=Path("dist/mobilemodels_mysql_seed.sql"),
        help="Output SQL path",
    )
    return parser.parse_args()


def main() -> int:
    args = parse_args()
    repo_root = args.repo_root.resolve()
    output_path = args.output if args.output.is_absolute() else PROJECT_ROOT / args.output

    records = build_records(repo_root)
    device_record_count = len(records)
    catalog_rows = build_catalog_rows(records)
    brand_rows = build_brand_rows(records)

    lines = [
        "-- MobileModels MySQL seed",
        "-- Generated by tools/export_mysql_seed.py",
        "USE `mobilemodels`;",
        "",
        "START TRANSACTION;",
        "",
        "DELETE FROM `mm_device_catalog`;",
        "DELETE FROM `mm_brand_lookup`;",
        "",
    ]
    append_insert_block(
        lines,
        "mm_device_catalog",
        [
            "record_id",
            "model",
            "alias_norm",
            "device_name",
            "brand",
            "manufacturer_brand",
            "parent_brand",
            "market_brand",
            "device_type",
            "code",
            "code_alias",
            "ver_name",
            "source_file",
            "section",
            "source_rank",
            "source_weight",
        ],
        catalog_rows,
    )
    append_insert_block(
        lines,
        "mm_brand_lookup",
        [
            "alias_norm",
            "alias_type",
            "canonical_brand",
            "manufacturer_brand",
            "parent_brand",
            "market_brand",
        ],
        brand_rows,
    )

    lines.extend([
        "COMMIT;",
        "",
        f"-- device_records: {device_record_count}",
        f"-- device_catalog_rows: {len(catalog_rows)}",
        f"-- device_lookup_rows: {len(catalog_rows)}",
        f"-- brand_lookup_rows: {len(brand_rows)}",
        f"-- legacy_models_rows: {len(catalog_rows)}",
        "",
    ])

    output_path.parent.mkdir(parents=True, exist_ok=True)
    output_path.write_text("\n".join(lines), encoding="utf-8")
    print(f"Exported MySQL seed: {output_path}")
    print(f"device_records={device_record_count}")
    print(f"device_catalog_rows={len(catalog_rows)}")
    print(f"device_lookup_rows={len(catalog_rows)}")
    print(f"brand_lookup_rows={len(brand_rows)}")
    print(f"legacy_models_rows={len(catalog_rows)}")
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
@@ -0,0 +1,67 @@
#!/bin/sh
set -eu

APP_ROOT="${APP_ROOT:-/app}"
DATA_ROOT="${MOBILEMODELS_DATA_ROOT:-/data}"

mkdir -p "$DATA_ROOT" "$DATA_ROOT/state"

sync_missing_dir_entries() {
    src_dir="$1"
    dst_dir="$2"

    mkdir -p "$dst_dir"

    for src_entry in "$src_dir"/*; do
        [ -e "$src_entry" ] || continue
        name="$(basename "$src_entry")"
        dst_entry="$dst_dir/$name"

        if [ -d "$src_entry" ]; then
            sync_missing_dir_entries "$src_entry" "$dst_entry"
            continue
        fi

        if [ ! -e "$dst_entry" ] && [ ! -L "$dst_entry" ]; then
            mkdir -p "$(dirname "$dst_entry")"
            cp -a "$src_entry" "$dst_entry"
        fi
    done
}

init_path() {
    rel_path="$1"
    src_path="$APP_ROOT/$rel_path"
    dst_path="$DATA_ROOT/$rel_path"

    if [ -d "$src_path" ]; then
        if [ ! -e "$dst_path" ] && [ ! -L "$dst_path" ]; then
            mkdir -p "$(dirname "$dst_path")"
            cp -a "$src_path" "$dst_path"
        else
            sync_missing_dir_entries "$src_path" "$dst_path"
        fi
    elif [ ! -e "$dst_path" ] && [ ! -L "$dst_path" ]; then
        mkdir -p "$(dirname "$dst_path")"
        cp -a "$src_path" "$dst_path"
    fi

    if [ -L "$src_path" ]; then
        current_target="$(readlink "$src_path" || true)"
        if [ "$current_target" = "$dst_path" ]; then
            return
        fi
        rm -f "$src_path"
    else
        rm -rf "$src_path"
    fi

    ln -s "$dst_path" "$src_path"
}

for rel_path in \
    workspace \
    dist
do
    init_path "$rel_path"
done
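The entrypoint's `init_path` seeds each path into the data volume on first boot, then replaces the in-image copy with a symlink so later writes land on the volume. A rough Python equivalent of that flow (the temp-directory layout and file names here are illustrative, not part of the script):

```python
import shutil
import tempfile
from pathlib import Path

def init_path(app_root: Path, data_root: Path, rel_path: str) -> None:
    src, dst = app_root / rel_path, data_root / rel_path
    # First boot: seed the data volume from the image contents.
    if not dst.exists() and not dst.is_symlink():
        dst.parent.mkdir(parents=True, exist_ok=True)
        if src.is_dir():
            shutil.copytree(src, dst)
        else:
            shutil.copy2(src, dst)
    # Replace the in-image path with a symlink into the volume.
    if src.is_dir() and not src.is_symlink():
        shutil.rmtree(src)
    elif src.exists() or src.is_symlink():
        src.unlink()
    src.symlink_to(dst)

root = Path(tempfile.mkdtemp())
app, data = root / "app", root / "data"
(app / "dist").mkdir(parents=True)
(app / "dist" / "index.json").write_text("{}")
init_path(app, data, "dist")
```

After the call, `app/dist` is a symlink into `data/dist`, so the container can be rebuilt without losing generated artifacts.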
@@ -0,0 +1,164 @@
#!/usr/bin/env python3
"""Load MobileModels schema and seed data into MySQL."""

from __future__ import annotations

import argparse
import os
import subprocess
import sys
import time
from pathlib import Path

from project_layout import PROJECT_ROOT


def mysql_env(password: str) -> dict[str, str]:
    env = os.environ.copy()
    env["MYSQL_PWD"] = password
    return env


def mysql_command(user: str, host: str, port: int, database: str | None = None) -> list[str]:
    command = [
        "mysql",
        f"--host={host}",
        f"--port={port}",
        f"--user={user}",
        "--protocol=TCP",
        "--default-character-set=utf8mb4",
    ]
    if database:
        command.append(database)
    return command


def mysqladmin_ping(user: str, password: str, host: str, port: int) -> bool:
    proc = subprocess.run(
        [
            "mysqladmin",
            f"--host={host}",
            f"--port={port}",
            f"--user={user}",
            "--protocol=TCP",
            "ping",
            "--silent",
        ],
        env=mysql_env(password),
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        text=True,
        check=False,
    )
    return proc.returncode == 0


def wait_for_mysql(user: str, password: str, host: str, port: int, timeout: int) -> None:
    deadline = time.time() + timeout
    while time.time() < deadline:
        if mysqladmin_ping(user, password, host, port):
            return
        time.sleep(2)
    raise RuntimeError(f"MySQL not ready within {timeout}s: {host}:{port}")


def run_sql_file(user: str, password: str, host: str, port: int, path: Path, database: str | None = None) -> None:
    sql_text = path.read_text(encoding="utf-8")
    proc = subprocess.run(
        mysql_command(user, host, port, database=database),
        env=mysql_env(password),
        input=sql_text,
        text=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        check=False,
    )
    if proc.returncode != 0:
        message = proc.stderr.strip() or proc.stdout.strip() or f"mysql exited with {proc.returncode}"
        raise RuntimeError(f"Failed to run SQL file {path}: {message}")


def sql_string(value: str) -> str:
    return value.replace("\\", "\\\\").replace("'", "''")


def ensure_reader_user(
    user: str,
    password: str,
    host: str,
    port: int,
    database: str,
    reader_user: str,
    reader_password: str,
) -> None:
    sql = f"""
CREATE USER IF NOT EXISTS '{sql_string(reader_user)}'@'%' IDENTIFIED BY '{sql_string(reader_password)}';
ALTER USER '{sql_string(reader_user)}'@'%' IDENTIFIED BY '{sql_string(reader_password)}';
GRANT SELECT ON `{database}`.* TO '{sql_string(reader_user)}'@'%';
GRANT SELECT ON `python_services_test`.* TO '{sql_string(reader_user)}'@'%';
FLUSH PRIVILEGES;
"""
    proc = subprocess.run(
        mysql_command(user, host, port),
        env=mysql_env(password),
        input=sql,
        text=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        check=False,
    )
    if proc.returncode != 0:
        message = proc.stderr.strip() or proc.stdout.strip() or f"mysql exited with {proc.returncode}"
        raise RuntimeError(f"Failed to create read-only user: {message}")


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Load MobileModels schema and seed data into MySQL.")
    parser.add_argument("--schema", type=Path, default=Path("sql/mobilemodels_mysql_schema.sql"))
    parser.add_argument("--seed", type=Path, default=Path("dist/mobilemodels_mysql_seed.sql"))
    parser.add_argument("--host", default=os.environ.get("MYSQL_HOST", "mysql"))
    parser.add_argument("--port", type=int, default=int(os.environ.get("MYSQL_PORT", "3306")))
    parser.add_argument("--user", default=os.environ.get("MYSQL_ROOT_USER", "root"))
    parser.add_argument("--password", default=os.environ.get("MYSQL_ROOT_PASSWORD", "mobilemodels_root"))
    parser.add_argument("--database", default=os.environ.get("MYSQL_DATABASE", "mobilemodels"))
    parser.add_argument("--reader-user", default=os.environ.get("MYSQL_READER_USER", ""))
    parser.add_argument("--reader-password", default=os.environ.get("MYSQL_READER_PASSWORD", ""))
    parser.add_argument("--wait-timeout", type=int, default=120)
    parser.add_argument("--check-only", action="store_true", help="Only check MySQL readiness")
    return parser.parse_args()


def main() -> int:
    args = parse_args()
    schema_path = args.schema if args.schema.is_absolute() else PROJECT_ROOT / args.schema
    seed_path = args.seed if args.seed.is_absolute() else PROJECT_ROOT / args.seed

    wait_for_mysql(args.user, args.password, args.host, args.port, args.wait_timeout)

    if args.check_only:
        print(f"MySQL ready: {args.host}:{args.port}")
        return 0

    run_sql_file(args.user, args.password, args.host, args.port, schema_path)
    run_sql_file(args.user, args.password, args.host, args.port, seed_path)

    if args.reader_user and args.reader_password:
        ensure_reader_user(
            args.user,
            args.password,
            args.host,
            args.port,
            args.database,
            args.reader_user,
            args.reader_password,
        )

    print(f"Loaded schema: {schema_path}")
    print(f"Loaded seed: {seed_path}")
    if args.reader_user:
        print(f"Ensured reader user: {args.reader_user}")
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
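Note that the two escaping helpers in this change set differ on purpose: `sql_quote` in the seed exporter backslash-escapes quotes for generated VALUES lists, while `sql_string` here doubles single quotes, the standard SQL form for string literals inside the `CREATE USER` / `GRANT` statements. A standalone check of `sql_string`:

```python
def sql_string(value: str) -> str:
    # Escape backslashes first, then double single quotes
    # (standard SQL string-literal escaping).
    return value.replace("\\", "\\\\").replace("'", "''")

print(sql_string("pa'ss\\word"))
```

The order matters: escaping backslashes after doubling quotes would corrupt nothing here, but doing backslashes first keeps each character transformed exactly once.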
@@ -0,0 +1,9 @@
#!/usr/bin/env python3
"""Shared path helpers for the project layout."""

from __future__ import annotations

from pathlib import Path

PROJECT_ROOT = Path(__file__).resolve().parent.parent
WORKSPACE_ROOT = PROJECT_ROOT / "workspace"
@@ -0,0 +1,170 @@
#!/usr/bin/env python3
"""Sync selected upstream MobileModels data into this repository."""

from __future__ import annotations

import argparse
import filecmp
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path

from project_layout import PROJECT_ROOT, WORKSPACE_ROOT

DEFAULT_REPO_URL = "https://github.com/KHwang9883/MobileModels.git"
DEFAULT_BRANCH = "master"
SYNC_PATHS = [
    "brands",
    "misc",
    "CHANGELOG.md",
    "CHANGELOG_en.md",
    "LICENSE.txt",
]


def run(cmd: list[str], cwd: Path | None = None) -> None:
    subprocess.run(cmd, cwd=cwd or PROJECT_ROOT, check=True)


def remove_path(path: Path) -> None:
    if path.is_dir():
        shutil.rmtree(path)
    elif path.exists():
        path.unlink()


def sync_path(src: Path, dst: Path) -> None:
    if src.is_dir():
        dst.mkdir(parents=True, exist_ok=True)
        source_children = {child.name for child in src.iterdir()}

        for existing in dst.iterdir():
            if existing.name not in source_children:
                remove_path(existing)

        for child in src.iterdir():
            sync_path(child, dst / child.name)
        return

    dst.parent.mkdir(parents=True, exist_ok=True)
    if dst.exists() and filecmp.cmp(src, dst, shallow=False):
        return
    shutil.copy2(src, dst)


def sync_selected_paths(upstream_root: Path) -> None:
    for relative_path in SYNC_PATHS:
        src = upstream_root / relative_path
        dst = WORKSPACE_ROOT / relative_path
        if not src.exists():
            raise FileNotFoundError(f"Missing upstream path: {relative_path}")
        sync_path(src, dst)


def build_index(output_path: str) -> None:
    run(
        [
            sys.executable,
            str(PROJECT_ROOT / "tools/device_mapper.py"),
            "--repo-root",
            str(WORKSPACE_ROOT),
            "build",
            "--output",
            output_path,
        ]
    )


def export_mysql_seed(output_path: str) -> None:
    run(
        [
            sys.executable,
            str(PROJECT_ROOT / "tools/export_mysql_seed.py"),
            "--output",
            output_path,
            "--repo-root",
            str(WORKSPACE_ROOT),
        ]
    )


def load_mysql_seed(seed_path: str) -> None:
    run(
        [
            sys.executable,
            str(PROJECT_ROOT / "tools/load_mysql_seed.py"),
            "--seed",
            seed_path,
        ]
    )


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Sync upstream MobileModels raw data and optionally rebuild the device index."
    )
    parser.add_argument("--repo-url", default=DEFAULT_REPO_URL, help="Upstream git repository URL")
    parser.add_argument("--branch", default=DEFAULT_BRANCH, help="Upstream branch to sync from")
    parser.add_argument(
        "--build-index",
        action="store_true",
        help="Rebuild dist/device_index.json after syncing upstream data",
    )
    parser.add_argument(
        "--index-output",
        default="dist/device_index.json",
        help="Output path for the rebuilt device index",
    )
    parser.add_argument(
        "--export-mysql-seed",
        action="store_true",
        help="Export MySQL seed SQL after syncing upstream data",
    )
    parser.add_argument(
        "--mysql-seed-output",
        default="dist/mobilemodels_mysql_seed.sql",
        help="Output path for the exported MySQL seed SQL",
    )
    parser.add_argument(
        "--load-mysql",
        action="store_true",
        help="Load schema and seed data into MySQL after exporting seed SQL",
    )
    return parser.parse_args()


def main() -> int:
    args = parse_args()

    with tempfile.TemporaryDirectory(prefix="mobilemodels-upstream-") as tmpdir:
        upstream_root = Path(tmpdir) / "upstream"
        run(
            [
                "git",
                "clone",
                "--depth",
                "1",
                "--branch",
                args.branch,
                args.repo_url,
                str(upstream_root),
            ]
        )
        sync_selected_paths(upstream_root)

    if args.build_index:
        build_index(args.index_output)

    if args.export_mysql_seed or args.load_mysql:
        export_mysql_seed(args.mysql_seed_output)

    if args.load_mysql:
        load_mysql_seed(args.mysql_seed_output)

    return 0


if __name__ == "__main__":
    raise SystemExit(main())
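The `sync_path` function above mirrors a directory tree: entries missing upstream are deleted locally, byte-identical files are skipped via `filecmp`, and changed files are copied. Its behavior can be exercised in isolation (the temp paths and file names are illustrative only):

```python
import filecmp
import shutil
import tempfile
from pathlib import Path

def sync_path(src: Path, dst: Path) -> None:
    if src.is_dir():
        dst.mkdir(parents=True, exist_ok=True)
        source_children = {child.name for child in src.iterdir()}
        # Delete local entries that no longer exist upstream.
        for existing in dst.iterdir():
            if existing.name not in source_children:
                if existing.is_dir():
                    shutil.rmtree(existing)
                else:
                    existing.unlink()
        for child in src.iterdir():
            sync_path(child, dst / child.name)
        return
    dst.parent.mkdir(parents=True, exist_ok=True)
    # Skip byte-identical files to keep local mtimes stable.
    if dst.exists() and filecmp.cmp(src, dst, shallow=False):
        return
    shutil.copy2(src, dst)

root = Path(tempfile.mkdtemp())
src, dst = root / "upstream" / "brands", root / "workspace" / "brands"
src.mkdir(parents=True)
dst.mkdir(parents=True)
(src / "samsung.md").write_text("# new\n")
(dst / "samsung.md").write_text("# old\n")
(dst / "stale.md").write_text("gone\n")
sync_path(src, dst)
```

The `shallow=False` comparison forces a content check rather than trusting size and mtime, which matters here because a fresh `git clone` gives every upstream file a new mtime.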
@@ -0,0 +1,793 @@
#!/usr/bin/env python3
"""Compose-facing web server for MobileModels static pages and maintenance APIs."""

from __future__ import annotations

import argparse
import json
import os
import re
import subprocess
import threading
import time
from datetime import datetime, timedelta
from http import HTTPStatus
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer
from pathlib import Path

from project_layout import PROJECT_ROOT, WORKSPACE_ROOT
from sync_upstream_mobilemodels import DEFAULT_BRANCH, DEFAULT_REPO_URL


SYNC_SCRIPT = PROJECT_ROOT / "tools/sync_upstream_mobilemodels.py"
INDEX_PATH = PROJECT_ROOT / "dist/device_index.json"
MYSQL_SEED_PATH = PROJECT_ROOT / "dist/mobilemodels_mysql_seed.sql"
MYSQL_LOADER = PROJECT_ROOT / "tools/load_mysql_seed.py"
DATA_ROOT = Path(os.environ.get("MOBILEMODELS_DATA_ROOT", "/data"))
SYNC_METADATA_PATH = DATA_ROOT / "state/sync_status.json"
SCHEDULE_CONFIG_PATH = DATA_ROOT / "state/sync_schedule.json"
MYSQL_CONFIG_PATH = DATA_ROOT / "state/mysql_settings.json"
SYNC_LOCK = threading.Lock()
SCHEDULE_LOCK = threading.Lock()
MYSQL_CONFIG_LOCK = threading.Lock()
NORMALIZE_RE = re.compile(r"[^0-9a-z\u4e00-\u9fff]+")
SCHEDULE_TIME_RE = re.compile(r"^(?:[01]?\d|2[0-3]):[0-5]\d$")
SCHEDULER_POLL_SECONDS = 20


def truthy_env(name: str, default: str = "0") -> bool:
    return os.environ.get(name, default).strip().lower() in {"1", "true", "yes", "on"}


def apply_timezone_from_env() -> None:
    if not os.environ.get("TZ"):
        return
    try:
        time.tzset()
    except AttributeError:
        return


def mysql_auto_load_enabled() -> bool:
    return bool(read_mysql_config().get("auto_load"))


def mysql_probe_credentials() -> tuple[str, str]:
    reader_user = os.environ.get("MYSQL_READER_USER", "").strip()
    reader_password = os.environ.get("MYSQL_READER_PASSWORD", "")
    if reader_user:
        return reader_user, reader_password
    return (
        os.environ.get("MYSQL_ROOT_USER", "root").strip() or "root",
        os.environ.get("MYSQL_ROOT_PASSWORD", ""),
    )


def local_now() -> datetime:
    return datetime.now().astimezone()


def normalize_schedule_time(value: str | None, *, fallback: str = "03:00") -> str:
    text = str(value or "").strip()
    if not text:
        text = fallback
    if not SCHEDULE_TIME_RE.match(text):
        if fallback and text != fallback:
            return normalize_schedule_time(fallback, fallback="")
        raise RuntimeError("Daily sync time must be in HH:MM format, e.g. 03:00.")
    hour, minute = text.split(":", 1)
    return f"{int(hour):02d}:{int(minute):02d}"


def normalize_github_proxy_prefix(value: str | None) -> str:
|
||||||
|
text = str(value or "").strip()
|
||||||
|
if not text:
|
||||||
|
return ""
|
||||||
|
if "://" not in text:
|
||||||
|
raise RuntimeError("GitHub 加速前缀必须包含协议,例如 https://ghfast.top/")
|
||||||
|
if not text.endswith("/"):
|
||||||
|
text = f"{text}/"
|
||||||
|
return text
|
||||||
|
|
||||||
|
|
||||||
|
def get_effective_repo_url(github_proxy_prefix: str | None = None) -> str:
|
||||||
|
prefix = normalize_github_proxy_prefix(
|
||||||
|
github_proxy_prefix if github_proxy_prefix is not None else os.environ.get("GITHUB_PROXY_PREFIX", "")
|
||||||
|
)
|
||||||
|
return f"{prefix}{DEFAULT_REPO_URL}" if prefix else DEFAULT_REPO_URL
|
||||||
|
|
||||||
|
|
||||||
|
def compute_next_run_at(daily_time: str, now: datetime | None = None) -> str:
|
||||||
|
current = now or local_now()
|
||||||
|
hour_text, minute_text = daily_time.split(":", 1)
|
||||||
|
candidate = current.replace(
|
||||||
|
hour=int(hour_text),
|
||||||
|
minute=int(minute_text),
|
||||||
|
second=0,
|
||||||
|
microsecond=0,
|
||||||
|
)
|
||||||
|
if candidate <= current:
|
||||||
|
candidate += timedelta(days=1)
|
||||||
|
return candidate.isoformat(timespec="seconds")
|
||||||
|
|
||||||
|
|
||||||
|
def default_schedule_config() -> dict[str, object]:
|
||||||
|
enabled = truthy_env("SYNC_SCHEDULE_ENABLED", "0")
|
||||||
|
daily_time = normalize_schedule_time(os.environ.get("SYNC_SCHEDULE_TIME", "03:00"))
|
||||||
|
timezone_name = os.environ.get("TZ", "UTC").strip() or "UTC"
|
||||||
|
github_proxy_prefix = normalize_github_proxy_prefix(os.environ.get("GITHUB_PROXY_PREFIX", ""))
|
||||||
|
return {
|
||||||
|
"enabled": enabled,
|
||||||
|
"daily_time": daily_time,
|
||||||
|
"timezone": timezone_name,
|
||||||
|
"github_proxy_prefix": github_proxy_prefix,
|
||||||
|
"next_run_at": compute_next_run_at(daily_time) if enabled else None,
|
||||||
|
"last_run_time": None,
|
||||||
|
"last_run_status": None,
|
||||||
|
"last_run_message": None,
|
||||||
|
"updated_at": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def default_mysql_config() -> dict[str, object]:
|
||||||
|
return {
|
||||||
|
"auto_load": truthy_env("MYSQL_AUTO_LOAD", "0"),
|
||||||
|
"updated_at": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def normalize_mysql_config(raw: dict[str, object] | None) -> dict[str, object]:
|
||||||
|
config = default_mysql_config()
|
||||||
|
if isinstance(raw, dict):
|
||||||
|
if "auto_load" in raw:
|
||||||
|
value = raw.get("auto_load")
|
||||||
|
config["auto_load"] = value if isinstance(value, bool) else str(value).strip().lower() in {"1", "true", "yes", "on"}
|
||||||
|
if raw.get("updated_at") is not None:
|
||||||
|
config["updated_at"] = raw.get("updated_at")
|
||||||
|
return config
|
||||||
|
|
||||||
|
|
||||||
|
def read_mysql_config() -> dict[str, object]:
|
||||||
|
with MYSQL_CONFIG_LOCK:
|
||||||
|
if not MYSQL_CONFIG_PATH.exists():
|
||||||
|
return normalize_mysql_config(None)
|
||||||
|
try:
|
||||||
|
payload = json.loads(MYSQL_CONFIG_PATH.read_text(encoding="utf-8"))
|
||||||
|
except Exception:
|
||||||
|
return normalize_mysql_config(None)
|
||||||
|
return normalize_mysql_config(payload if isinstance(payload, dict) else None)
|
||||||
|
|
||||||
|
|
||||||
|
def write_mysql_config(payload: dict[str, object]) -> dict[str, object]:
|
||||||
|
normalized = normalize_mysql_config(payload)
|
||||||
|
with MYSQL_CONFIG_LOCK:
|
||||||
|
MYSQL_CONFIG_PATH.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
MYSQL_CONFIG_PATH.write_text(
|
||||||
|
json.dumps(normalized, ensure_ascii=False, indent=2),
|
||||||
|
encoding="utf-8",
|
||||||
|
)
|
||||||
|
return normalized
|
||||||
|
|
||||||
|
|
||||||
|
def update_mysql_config(payload: dict[str, object]) -> dict[str, object]:
|
||||||
|
current = read_mysql_config()
|
||||||
|
auto_load_raw = payload.get("auto_load", current.get("auto_load", False))
|
||||||
|
auto_load = auto_load_raw if isinstance(auto_load_raw, bool) else str(auto_load_raw).strip().lower() in {"1", "true", "yes", "on"}
|
||||||
|
updated = {
|
||||||
|
**current,
|
||||||
|
"auto_load": auto_load,
|
||||||
|
"updated_at": local_now().isoformat(timespec="seconds"),
|
||||||
|
}
|
||||||
|
return write_mysql_config(updated)
|
||||||
|
|
||||||
|
|
||||||
|
def normalize_schedule_config(raw: dict[str, object] | None) -> dict[str, object]:
|
||||||
|
config = default_schedule_config()
|
||||||
|
if isinstance(raw, dict):
|
||||||
|
if "enabled" in raw:
|
||||||
|
value = raw.get("enabled")
|
||||||
|
config["enabled"] = value if isinstance(value, bool) else str(value).strip().lower() in {"1", "true", "yes", "on"}
|
||||||
|
if "daily_time" in raw:
|
||||||
|
try:
|
||||||
|
config["daily_time"] = normalize_schedule_time(str(raw.get("daily_time") or ""))
|
||||||
|
except RuntimeError:
|
||||||
|
config["daily_time"] = normalize_schedule_time(os.environ.get("SYNC_SCHEDULE_TIME", "03:00"))
|
||||||
|
if raw.get("timezone"):
|
||||||
|
config["timezone"] = str(raw.get("timezone")).strip() or config["timezone"]
|
||||||
|
if "github_proxy_prefix" in raw:
|
||||||
|
try:
|
||||||
|
config["github_proxy_prefix"] = normalize_github_proxy_prefix(str(raw.get("github_proxy_prefix") or ""))
|
||||||
|
except RuntimeError:
|
||||||
|
config["github_proxy_prefix"] = normalize_github_proxy_prefix(os.environ.get("GITHUB_PROXY_PREFIX", ""))
|
||||||
|
for key in ("last_run_time", "last_run_status", "last_run_message", "updated_at"):
|
||||||
|
if raw.get(key) is not None:
|
||||||
|
config[key] = raw.get(key)
|
||||||
|
next_run_at = raw.get("next_run_at")
|
||||||
|
if config["enabled"] and isinstance(next_run_at, str) and next_run_at.strip():
|
||||||
|
try:
|
||||||
|
datetime.fromisoformat(next_run_at)
|
||||||
|
config["next_run_at"] = next_run_at
|
||||||
|
except ValueError:
|
||||||
|
config["next_run_at"] = compute_next_run_at(str(config["daily_time"]))
|
||||||
|
else:
|
||||||
|
config["next_run_at"] = compute_next_run_at(str(config["daily_time"])) if config["enabled"] else None
|
||||||
|
return config
|
||||||
|
|
||||||
|
|
||||||
|
def read_schedule_config() -> dict[str, object]:
|
||||||
|
with SCHEDULE_LOCK:
|
||||||
|
if not SCHEDULE_CONFIG_PATH.exists():
|
||||||
|
return normalize_schedule_config(None)
|
||||||
|
try:
|
||||||
|
payload = json.loads(SCHEDULE_CONFIG_PATH.read_text(encoding="utf-8"))
|
||||||
|
except Exception:
|
||||||
|
return normalize_schedule_config(None)
|
||||||
|
return normalize_schedule_config(payload if isinstance(payload, dict) else None)
|
||||||
|
|
||||||
|
|
||||||
|
def write_schedule_config(payload: dict[str, object]) -> dict[str, object]:
|
||||||
|
normalized = normalize_schedule_config(payload)
|
||||||
|
with SCHEDULE_LOCK:
|
||||||
|
SCHEDULE_CONFIG_PATH.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
SCHEDULE_CONFIG_PATH.write_text(
|
||||||
|
json.dumps(normalized, ensure_ascii=False, indent=2),
|
||||||
|
encoding="utf-8",
|
||||||
|
)
|
||||||
|
return normalized
|
||||||
|
|
||||||
|
|
||||||
|
def update_schedule_config(payload: dict[str, object]) -> dict[str, object]:
|
||||||
|
current = read_schedule_config()
|
||||||
|
enabled_raw = payload.get("enabled", current.get("enabled", False))
|
||||||
|
enabled = enabled_raw if isinstance(enabled_raw, bool) else str(enabled_raw).strip().lower() in {"1", "true", "yes", "on"}
|
||||||
|
daily_time = normalize_schedule_time(str(payload.get("daily_time") or current.get("daily_time") or "03:00"))
|
||||||
|
github_proxy_prefix = normalize_github_proxy_prefix(
|
||||||
|
str(payload.get("github_proxy_prefix") if "github_proxy_prefix" in payload else current.get("github_proxy_prefix") or "")
|
||||||
|
)
|
||||||
|
updated = {
|
||||||
|
**current,
|
||||||
|
"enabled": enabled,
|
||||||
|
"daily_time": daily_time,
|
||||||
|
"timezone": os.environ.get("TZ", "UTC").strip() or "UTC",
|
||||||
|
"github_proxy_prefix": github_proxy_prefix,
|
||||||
|
"next_run_at": compute_next_run_at(daily_time) if enabled else None,
|
||||||
|
"updated_at": local_now().isoformat(timespec="seconds"),
|
||||||
|
}
|
||||||
|
return write_schedule_config(updated)
|
||||||
|
|
||||||
|
|
||||||
|
def mark_schedule_run(status: str, message: str) -> dict[str, object]:
|
||||||
|
current = read_schedule_config()
|
||||||
|
updated = {
|
||||||
|
**current,
|
||||||
|
"last_run_time": local_now().isoformat(timespec="seconds"),
|
||||||
|
"last_run_status": status,
|
||||||
|
"last_run_message": message,
|
||||||
|
"next_run_at": compute_next_run_at(str(current.get("daily_time") or "03:00")) if current.get("enabled") else None,
|
||||||
|
"updated_at": local_now().isoformat(timespec="seconds"),
|
||||||
|
}
|
||||||
|
return write_schedule_config(updated)
|
||||||
|
|
||||||
|
|
||||||
|
def run_command(args: list[str]) -> subprocess.CompletedProcess[str]:
|
||||||
|
return subprocess.run(
|
||||||
|
args,
|
||||||
|
cwd=PROJECT_ROOT,
|
||||||
|
text=True,
|
||||||
|
capture_output=True,
|
||||||
|
check=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def sanitize_mysql_message(message: str, host: str | None = None, port: str | None = None) -> str:
|
||||||
|
lines = [line.strip() for line in str(message or "").splitlines() if line.strip()]
|
||||||
|
filtered = [line for line in lines if not line.startswith("WARNING:")]
|
||||||
|
text = "\n".join(filtered or lines)
|
||||||
|
host_text = host or os.environ.get("MYSQL_HOST", "mysql")
|
||||||
|
port_text = port or os.environ.get("MYSQL_PORT", "3306")
|
||||||
|
|
||||||
|
if "Can't connect to server on" in text or "Can't connect to MySQL server on" in text:
|
||||||
|
return f"MySQL 当前无法连接: {host_text}:{port_text}"
|
||||||
|
if "Access denied" in text:
|
||||||
|
return f"MySQL 账号或密码无效: {host_text}:{port_text}"
|
||||||
|
return text or f"MySQL 当前无法连接: {host_text}:{port_text}"
|
||||||
|
|
||||||
|
|
||||||
|
def normalize_text(text: str) -> str:
|
||||||
|
return NORMALIZE_RE.sub("", (text or "").lower())
|
||||||
|
|
||||||
|
|
||||||
|
def sql_string(value: str) -> str:
|
||||||
|
return (value or "").replace("\\", "\\\\").replace("'", "''")
|
||||||
|
|
||||||
|
|
||||||
|
def mysql_command(database: str | None = None) -> list[str]:
|
||||||
|
command = [
|
||||||
|
"mysql",
|
||||||
|
f"--host={os.environ.get('MYSQL_HOST', 'mysql')}",
|
||||||
|
f"--port={os.environ.get('MYSQL_PORT', '3306')}",
|
||||||
|
f"--user={os.environ.get('MYSQL_READER_USER', '')}",
|
||||||
|
"--protocol=TCP",
|
||||||
|
"--default-character-set=utf8mb4",
|
||||||
|
"--batch",
|
||||||
|
"--raw",
|
||||||
|
]
|
||||||
|
if database:
|
||||||
|
command.append(database)
|
||||||
|
return command
|
||||||
|
|
||||||
|
|
||||||
|
def mysql_env() -> dict[str, str]:
|
||||||
|
env = os.environ.copy()
|
||||||
|
env["MYSQL_PWD"] = os.environ.get("MYSQL_READER_PASSWORD", "")
|
||||||
|
return env
|
||||||
|
|
||||||
|
|
||||||
|
def run_mysql_query(sql: str, database: str | None = None) -> list[dict[str, str | None]]:
|
||||||
|
proc = subprocess.run(
|
||||||
|
mysql_command(database=database),
|
||||||
|
env=mysql_env(),
|
||||||
|
input=sql,
|
||||||
|
text=True,
|
||||||
|
stdout=subprocess.PIPE,
|
||||||
|
stderr=subprocess.PIPE,
|
||||||
|
check=False,
|
||||||
|
)
|
||||||
|
if proc.returncode != 0:
|
||||||
|
message = sanitize_mysql_message(
|
||||||
|
proc.stderr.strip() or proc.stdout.strip() or f"mysql exited with {proc.returncode}"
|
||||||
|
)
|
||||||
|
raise RuntimeError(message)
|
||||||
|
|
||||||
|
lines = [line for line in proc.stdout.splitlines() if line.strip()]
|
||||||
|
if not lines:
|
||||||
|
return []
|
||||||
|
|
||||||
|
headers = lines[0].split("\t")
|
||||||
|
rows: list[dict[str, str | None]] = []
|
||||||
|
for line in lines[1:]:
|
||||||
|
values = line.split("\t")
|
||||||
|
row = {}
|
||||||
|
for idx, header in enumerate(headers):
|
||||||
|
value = values[idx] if idx < len(values) else ""
|
||||||
|
row[header] = None if value == "NULL" else value
|
||||||
|
rows.append(row)
|
||||||
|
return rows
|
||||||
|
|
||||||
|
|
||||||
|
def build_sql_query_payload(payload: dict[str, object]) -> dict[str, object]:
|
||||||
|
raw_value = str(payload.get("model_raw") or payload.get("model") or "").strip()
|
||||||
|
if not raw_value:
|
||||||
|
raise RuntimeError("请填写设备标识。")
|
||||||
|
|
||||||
|
alias_norm = normalize_text(raw_value)
|
||||||
|
if not alias_norm:
|
||||||
|
raise RuntimeError("设备标识无法归一化,请检查输入。")
|
||||||
|
|
||||||
|
limit_value = payload.get("limit", 20)
|
||||||
|
try:
|
||||||
|
limit = int(limit_value)
|
||||||
|
except Exception as err:
|
||||||
|
raise RuntimeError("limit 必须是数字。") from err
|
||||||
|
limit = max(1, min(limit, 100))
|
||||||
|
|
||||||
|
sql = f"""
|
||||||
|
SELECT
|
||||||
|
model,
|
||||||
|
record_id,
|
||||||
|
alias_norm,
|
||||||
|
device_name,
|
||||||
|
brand,
|
||||||
|
manufacturer_brand,
|
||||||
|
parent_brand,
|
||||||
|
market_brand,
|
||||||
|
device_type,
|
||||||
|
source_file,
|
||||||
|
section,
|
||||||
|
source_rank,
|
||||||
|
source_weight,
|
||||||
|
code,
|
||||||
|
code_alias,
|
||||||
|
ver_name
|
||||||
|
FROM mobilemodels.mm_device_catalog
|
||||||
|
WHERE alias_norm = '{sql_string(alias_norm)}'
|
||||||
|
ORDER BY source_rank ASC, record_id ASC
|
||||||
|
LIMIT {limit};
|
||||||
|
""".strip()
|
||||||
|
|
||||||
|
rows = run_mysql_query(sql)
|
||||||
|
return {
|
||||||
|
"query_mode": "sql",
|
||||||
|
"model_raw": raw_value,
|
||||||
|
"alias_norm": alias_norm,
|
||||||
|
"limit": limit,
|
||||||
|
"sql": sql,
|
||||||
|
"rows": rows,
|
||||||
|
"row_count": len(rows),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def read_sync_metadata() -> dict[str, object]:
|
||||||
|
if not SYNC_METADATA_PATH.exists():
|
||||||
|
return {}
|
||||||
|
try:
|
||||||
|
return json.loads(SYNC_METADATA_PATH.read_text(encoding="utf-8"))
|
||||||
|
except Exception:
|
||||||
|
return {}
|
||||||
|
|
||||||
|
|
||||||
|
def write_sync_metadata(payload: dict[str, object]) -> None:
|
||||||
|
SYNC_METADATA_PATH.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
SYNC_METADATA_PATH.write_text(
|
||||||
|
json.dumps(payload, ensure_ascii=False, indent=2),
|
||||||
|
encoding="utf-8",
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_status_payload() -> dict[str, object]:
|
||||||
|
index_mtime = None
|
||||||
|
mysql_seed_mtime = None
|
||||||
|
if INDEX_PATH.exists():
|
||||||
|
index_mtime = datetime.fromtimestamp(INDEX_PATH.stat().st_mtime).isoformat(timespec="seconds")
|
||||||
|
if MYSQL_SEED_PATH.exists():
|
||||||
|
mysql_seed_mtime = datetime.fromtimestamp(MYSQL_SEED_PATH.stat().st_mtime).isoformat(timespec="seconds")
|
||||||
|
|
||||||
|
mysql_host = os.environ.get("MYSQL_HOST", "mysql")
|
||||||
|
mysql_port = os.environ.get("MYSQL_PORT", "3306")
|
||||||
|
mysql_database = os.environ.get("MYSQL_DATABASE", "mobilemodels")
|
||||||
|
mysql_reader_user = os.environ.get("MYSQL_READER_USER", "")
|
||||||
|
mysql_reader_password = os.environ.get("MYSQL_READER_PASSWORD", "")
|
||||||
|
mysql_config = read_mysql_config()
|
||||||
|
mysql_auto_load = bool(mysql_config.get("auto_load"))
|
||||||
|
mysql_ready = False
|
||||||
|
mysql_status = ""
|
||||||
|
sync_metadata = read_sync_metadata()
|
||||||
|
schedule_config = read_schedule_config()
|
||||||
|
github_proxy_prefix = str(schedule_config.get("github_proxy_prefix") or "")
|
||||||
|
effective_repo_url = get_effective_repo_url(github_proxy_prefix)
|
||||||
|
probe_user, probe_password = mysql_probe_credentials()
|
||||||
|
mysql_proc = run_command(
|
||||||
|
[
|
||||||
|
"python3",
|
||||||
|
str(MYSQL_LOADER),
|
||||||
|
"--check-only",
|
||||||
|
"--wait-timeout",
|
||||||
|
"5",
|
||||||
|
f"--user={probe_user}",
|
||||||
|
f"--password={probe_password}",
|
||||||
|
]
|
||||||
|
)
|
||||||
|
if mysql_proc.returncode == 0:
|
||||||
|
mysql_ready = True
|
||||||
|
mysql_status = mysql_proc.stdout.strip() or "MySQL ready"
|
||||||
|
if not mysql_auto_load:
|
||||||
|
mysql_status = f"{mysql_status}; auto load disabled"
|
||||||
|
else:
|
||||||
|
failure_message = sanitize_mysql_message(
|
||||||
|
mysql_proc.stderr.strip() or mysql_proc.stdout.strip() or "MySQL unavailable",
|
||||||
|
host=mysql_host,
|
||||||
|
port=mysql_port,
|
||||||
|
)
|
||||||
|
mysql_status = failure_message if mysql_auto_load else f"{failure_message}; auto load disabled"
|
||||||
|
|
||||||
|
return {
|
||||||
|
"supports_upstream_sync": True,
|
||||||
|
"storage_mode": "docker_volume",
|
||||||
|
"project_root": str(PROJECT_ROOT),
|
||||||
|
"workspace_root": str(WORKSPACE_ROOT),
|
||||||
|
"data_root": str(DATA_ROOT),
|
||||||
|
"mysql_auto_load": mysql_auto_load,
|
||||||
|
"mysql_config_file": str(MYSQL_CONFIG_PATH.relative_to(DATA_ROOT)),
|
||||||
|
"mysql_config_updated_at": mysql_config.get("updated_at"),
|
||||||
|
"upstream_repo_url": DEFAULT_REPO_URL,
|
||||||
|
"effective_upstream_repo_url": effective_repo_url,
|
||||||
|
"upstream_branch": DEFAULT_BRANCH,
|
||||||
|
"last_sync_time": sync_metadata.get("last_sync_time"),
|
||||||
|
"last_upstream_commit": sync_metadata.get("last_upstream_commit"),
|
||||||
|
"last_sync_trigger": sync_metadata.get("last_trigger_source"),
|
||||||
|
"index_file": str(INDEX_PATH.relative_to(PROJECT_ROOT)),
|
||||||
|
"index_mtime": index_mtime,
|
||||||
|
"mysql_seed_file": str(MYSQL_SEED_PATH.relative_to(PROJECT_ROOT)),
|
||||||
|
"mysql_seed_mtime": mysql_seed_mtime,
|
||||||
|
"sync_schedule_file": str(SCHEDULE_CONFIG_PATH.relative_to(DATA_ROOT)),
|
||||||
|
"sync_schedule_enabled": schedule_config.get("enabled"),
|
||||||
|
"sync_schedule_time": schedule_config.get("daily_time"),
|
||||||
|
"sync_schedule_timezone": schedule_config.get("timezone"),
|
||||||
|
"github_proxy_prefix": github_proxy_prefix,
|
||||||
|
"sync_schedule_next_run": schedule_config.get("next_run_at"),
|
||||||
|
"sync_schedule_last_run_time": schedule_config.get("last_run_time"),
|
||||||
|
"sync_schedule_last_run_status": schedule_config.get("last_run_status"),
|
||||||
|
"sync_schedule_last_run_message": schedule_config.get("last_run_message"),
|
||||||
|
"mysql_host": mysql_host,
|
||||||
|
"mysql_port": mysql_port,
|
||||||
|
"mysql_database": mysql_database,
|
||||||
|
"mysql_reader_user": mysql_reader_user,
|
||||||
|
"mysql_reader_password": mysql_reader_password,
|
||||||
|
"mysql_ready": mysql_ready,
|
||||||
|
"mysql_status": mysql_status,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def run_upstream_sync(trigger_source: str = "manual") -> dict[str, object]:
|
||||||
|
if not SYNC_LOCK.acquire(blocking=False):
|
||||||
|
raise RuntimeError("已有同步任务在执行,请稍后再试。")
|
||||||
|
|
||||||
|
try:
|
||||||
|
schedule_config = read_schedule_config()
|
||||||
|
github_proxy_prefix = str(schedule_config.get("github_proxy_prefix") or "")
|
||||||
|
effective_repo_url = get_effective_repo_url(github_proxy_prefix)
|
||||||
|
upstream_proc = run_command(
|
||||||
|
["git", "ls-remote", effective_repo_url, f"refs/heads/{DEFAULT_BRANCH}"]
|
||||||
|
)
|
||||||
|
upstream_commit = ""
|
||||||
|
if upstream_proc.returncode == 0 and upstream_proc.stdout.strip():
|
||||||
|
upstream_commit = upstream_proc.stdout.split()[0]
|
||||||
|
|
||||||
|
command = [
|
||||||
|
"python3",
|
||||||
|
str(SYNC_SCRIPT),
|
||||||
|
f"--repo-url={effective_repo_url}",
|
||||||
|
"--build-index",
|
||||||
|
"--export-mysql-seed",
|
||||||
|
]
|
||||||
|
if mysql_auto_load_enabled():
|
||||||
|
command.append("--load-mysql")
|
||||||
|
proc = run_command(command)
|
||||||
|
output = "\n".join(
|
||||||
|
part for part in [proc.stdout.strip(), proc.stderr.strip()] if part
|
||||||
|
).strip()
|
||||||
|
|
||||||
|
if proc.returncode != 0:
|
||||||
|
raise RuntimeError(output or f"sync script failed with exit code {proc.returncode}")
|
||||||
|
|
||||||
|
payload = {
|
||||||
|
"storage_mode": "docker_volume",
|
||||||
|
"project_root": str(PROJECT_ROOT),
|
||||||
|
"workspace_root": str(WORKSPACE_ROOT),
|
||||||
|
"data_root": str(DATA_ROOT),
|
||||||
|
"upstream_repo_url": DEFAULT_REPO_URL,
|
||||||
|
"effective_upstream_repo_url": effective_repo_url,
|
||||||
|
"github_proxy_prefix": github_proxy_prefix,
|
||||||
|
"upstream_branch": DEFAULT_BRANCH,
|
||||||
|
"upstream_commit": upstream_commit,
|
||||||
|
"trigger_source": trigger_source,
|
||||||
|
"last_sync_time": datetime.now().isoformat(timespec="seconds"),
|
||||||
|
"last_upstream_commit": upstream_commit,
|
||||||
|
"index_file": str(INDEX_PATH.relative_to(PROJECT_ROOT)),
|
||||||
|
"index_mtime": datetime.fromtimestamp(INDEX_PATH.stat().st_mtime).isoformat(timespec="seconds")
|
||||||
|
if INDEX_PATH.exists()
|
||||||
|
else None,
|
||||||
|
"mysql_seed_file": str(MYSQL_SEED_PATH.relative_to(PROJECT_ROOT)),
|
||||||
|
"mysql_seed_mtime": datetime.fromtimestamp(MYSQL_SEED_PATH.stat().st_mtime).isoformat(timespec="seconds")
|
||||||
|
if MYSQL_SEED_PATH.exists()
|
||||||
|
else None,
|
||||||
|
"output": output or "同步脚本执行完成。",
|
||||||
|
}
|
||||||
|
write_sync_metadata({
|
||||||
|
"last_sync_time": payload["last_sync_time"],
|
||||||
|
"last_upstream_commit": payload["last_upstream_commit"],
|
||||||
|
"last_trigger_source": trigger_source,
|
||||||
|
"upstream_repo_url": DEFAULT_REPO_URL,
|
||||||
|
"effective_upstream_repo_url": effective_repo_url,
|
||||||
|
"github_proxy_prefix": github_proxy_prefix,
|
||||||
|
"upstream_branch": DEFAULT_BRANCH,
|
||||||
|
})
|
||||||
|
return payload
|
||||||
|
finally:
|
||||||
|
SYNC_LOCK.release()
|
||||||
|
|
||||||
|
|
||||||
|
def run_mysql_init(trigger_source: str = "manual") -> dict[str, object]:
|
||||||
|
if not SYNC_LOCK.acquire(blocking=False):
|
||||||
|
raise RuntimeError("已有同步或 MySQL 初始化任务在执行,请稍后再试。")
|
||||||
|
|
||||||
|
try:
|
||||||
|
proc = run_command(["python3", str(MYSQL_LOADER)])
|
||||||
|
output = "\n".join(
|
||||||
|
part for part in [proc.stdout.strip(), proc.stderr.strip()] if part
|
||||||
|
).strip()
|
||||||
|
if proc.returncode != 0:
|
||||||
|
raise RuntimeError(output or f"mysql load failed with exit code {proc.returncode}")
|
||||||
|
|
||||||
|
payload = get_status_payload()
|
||||||
|
payload.update(
|
||||||
|
{
|
||||||
|
"trigger_source": trigger_source,
|
||||||
|
"output": output or "MySQL 初始化完成。",
|
||||||
|
}
|
||||||
|
)
|
||||||
|
return payload
|
||||||
|
finally:
|
||||||
|
SYNC_LOCK.release()
|
||||||
|
|
||||||
|
|
||||||
|
def run_scheduled_sync_if_due() -> None:
|
||||||
|
schedule_config = read_schedule_config()
|
||||||
|
if not schedule_config.get("enabled"):
|
||||||
|
return
|
||||||
|
|
||||||
|
next_run_at = str(schedule_config.get("next_run_at") or "").strip()
|
||||||
|
if not next_run_at:
|
||||||
|
write_schedule_config({
|
||||||
|
**schedule_config,
|
||||||
|
"next_run_at": compute_next_run_at(str(schedule_config.get("daily_time") or "03:00")),
|
||||||
|
})
|
||||||
|
return
|
||||||
|
|
||||||
|
try:
|
||||||
|
next_run_dt = datetime.fromisoformat(next_run_at)
|
||||||
|
except ValueError:
|
||||||
|
write_schedule_config({
|
||||||
|
**schedule_config,
|
||||||
|
"next_run_at": compute_next_run_at(str(schedule_config.get("daily_time") or "03:00")),
|
||||||
|
})
|
||||||
|
return
|
||||||
|
|
||||||
|
if local_now() < next_run_dt:
|
||||||
|
return
|
||||||
|
|
||||||
|
try:
|
||||||
|
payload = run_upstream_sync(trigger_source="schedule")
|
||||||
|
message = str(payload.get("output") or "定时同步完成。")
|
||||||
|
mark_schedule_run("success", message)
|
||||||
|
print(f"[scheduler] upstream sync completed at {local_now().isoformat(timespec='seconds')}")
|
||||||
|
except RuntimeError as err:
|
||||||
|
status = "skipped" if "已有同步任务" in str(err) else "failed"
|
||||||
|
mark_schedule_run(status, str(err))
|
||||||
|
print(f"[scheduler] upstream sync {status}: {err}")
|
||||||
|
except Exception as err:
|
||||||
|
mark_schedule_run("failed", str(err))
|
||||||
|
print(f"[scheduler] upstream sync failed: {err}")
|
||||||
|
|
||||||
|
|
||||||
|
def scheduler_loop() -> None:
|
||||||
|
while True:
|
||||||
|
try:
|
||||||
|
run_scheduled_sync_if_due()
|
||||||
|
except Exception as err:
|
||||||
|
print(f"[scheduler] loop error: {err}")
|
||||||
|
time.sleep(SCHEDULER_POLL_SECONDS)
|
||||||
|
|
||||||
|
|
||||||
|
class MobileModelsHandler(SimpleHTTPRequestHandler):
|
||||||
|
def __init__(self, *args, **kwargs):
|
||||||
|
super().__init__(*args, directory=str(PROJECT_ROOT), **kwargs)
|
||||||
|
|
||||||
|
def guess_type(self, path: str) -> str:
|
||||||
|
content_type = super().guess_type(path)
|
||||||
|
lower_path = path.lower()
|
||||||
|
if lower_path.endswith(".md"):
|
||||||
|
return "text/markdown; charset=utf-8"
|
||||||
|
if lower_path.endswith(".txt"):
|
||||||
|
return "text/plain; charset=utf-8"
|
||||||
|
if content_type.startswith("text/") and "charset=" not in content_type:
|
||||||
|
return f"{content_type}; charset=utf-8"
|
||||||
|
return content_type
|
||||||
|
|
||||||
|
def _send_json(self, payload: dict[str, object], status: int = HTTPStatus.OK) -> None:
|
||||||
|
data = json.dumps(payload, ensure_ascii=False).encode("utf-8")
|
||||||
|
self.send_response(status)
|
||||||
|
self.send_header("Content-Type", "application/json; charset=utf-8")
|
||||||
|
self.send_header("Content-Length", str(len(data)))
|
||||||
|
self.send_header("Cache-Control", "no-store")
|
||||||
|
self.end_headers()
|
||||||
|
self.wfile.write(data)
|
||||||
|
|
||||||
|
def do_GET(self) -> None:
|
||||||
|
if self.path == "/api/status":
|
||||||
|
try:
|
||||||
|
self._send_json(get_status_payload())
|
||||||
|
except Exception as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
return
|
||||||
|
return super().do_GET()
|
||||||
|
|
||||||
|
def do_POST(self) -> None:
|
||||||
|
if self.path == "/api/sync-upstream":
|
||||||
|
try:
|
||||||
|
payload = run_upstream_sync()
|
||||||
|
self._send_json(payload)
|
||||||
|
except RuntimeError as err:
|
||||||
|
status = HTTPStatus.CONFLICT if "已有同步任务" in str(err) else HTTPStatus.INTERNAL_SERVER_ERROR
|
||||||
|
self._send_json({"error": str(err)}, status=status)
|
||||||
|
except Exception as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
return
|
||||||
|
if self.path == "/api/init-mysql":
|
||||||
|
try:
|
||||||
|
payload = run_mysql_init()
|
||||||
|
self._send_json(payload)
|
||||||
|
except RuntimeError as err:
|
||||||
|
status = HTTPStatus.CONFLICT if "已有同步或 MySQL 初始化任务" in str(err) else HTTPStatus.INTERNAL_SERVER_ERROR
|
||||||
|
self._send_json({"error": str(err)}, status=status)
|
||||||
|
except Exception as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
return
|
||||||
|
if self.path == "/api/query-sql":
|
||||||
|
try:
|
||||||
|
content_length = int(self.headers.get("Content-Length", "0") or "0")
|
||||||
|
raw_body = self.rfile.read(content_length) if content_length > 0 else b"{}"
|
||||||
|
req = json.loads(raw_body.decode("utf-8") or "{}")
|
||||||
|
payload = build_sql_query_payload(req if isinstance(req, dict) else {})
|
||||||
|
self._send_json(payload)
|
||||||
|
except RuntimeError as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.BAD_REQUEST)
|
||||||
|
except Exception as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
return
|
||||||
|
if self.path == "/api/sync-schedule":
|
||||||
|
try:
|
||||||
|
content_length = int(self.headers.get("Content-Length", "0") or "0")
|
||||||
|
raw_body = self.rfile.read(content_length) if content_length > 0 else b"{}"
|
||||||
|
req = json.loads(raw_body.decode("utf-8") or "{}")
|
||||||
|
if not isinstance(req, dict):
|
||||||
|
raise RuntimeError("请求体必须是 JSON 对象。")
|
||||||
|
schedule_config = update_schedule_config(req)
|
||||||
|
self._send_json(
|
||||||
|
{
|
||||||
|
"message": "同步设置已保存。",
|
||||||
|
"sync_schedule_enabled": schedule_config.get("enabled"),
|
||||||
|
"sync_schedule_time": schedule_config.get("daily_time"),
|
||||||
|
"sync_schedule_timezone": schedule_config.get("timezone"),
|
||||||
|
"github_proxy_prefix": schedule_config.get("github_proxy_prefix"),
|
||||||
|
"effective_upstream_repo_url": get_effective_repo_url(
|
||||||
|
str(schedule_config.get("github_proxy_prefix") or "")
|
||||||
|
),
|
||||||
|
"sync_schedule_next_run": schedule_config.get("next_run_at"),
|
||||||
|
"sync_schedule_last_run_time": schedule_config.get("last_run_time"),
|
||||||
|
"sync_schedule_last_run_status": schedule_config.get("last_run_status"),
|
||||||
|
"sync_schedule_last_run_message": schedule_config.get("last_run_message"),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
except RuntimeError as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.BAD_REQUEST)
|
||||||
|
except Exception as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
return
|
||||||
|
if self.path == "/api/mysql-settings":
|
||||||
|
try:
|
||||||
|
content_length = int(self.headers.get("Content-Length", "0") or "0")
|
||||||
|
raw_body = self.rfile.read(content_length) if content_length > 0 else b"{}"
|
||||||
|
req = json.loads(raw_body.decode("utf-8") or "{}")
|
||||||
|
if not isinstance(req, dict):
|
||||||
|
raise RuntimeError("请求体必须是 JSON 对象。")
|
||||||
|
mysql_config = update_mysql_config(req)
|
||||||
|
self._send_json(
|
||||||
|
{
|
||||||
|
"message": "MySQL 自动装载设置已保存。",
|
||||||
|
"mysql_auto_load": mysql_config.get("auto_load"),
|
||||||
|
"mysql_config_file": str(MYSQL_CONFIG_PATH.relative_to(DATA_ROOT)),
|
||||||
|
"mysql_config_updated_at": mysql_config.get("updated_at"),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
except RuntimeError as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.BAD_REQUEST)
|
||||||
|
except Exception as err:
|
||||||
|
self._send_json({"error": str(err)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
return
|
||||||
|

        self._send_json({"error": "Not found"}, status=HTTPStatus.NOT_FOUND)


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Run the MobileModels web server inside Docker Compose.")
    parser.add_argument("--host", default="127.0.0.1", help="Bind host")
    parser.add_argument("--port", type=int, default=8123, help="Bind port")
    return parser.parse_args()


def main() -> int:
    apply_timezone_from_env()
    write_schedule_config(read_schedule_config())
    write_mysql_config(read_mysql_config())
    args = parse_args()
    scheduler = threading.Thread(target=scheduler_loop, name="sync-scheduler", daemon=True)
    scheduler.start()
    server = ThreadingHTTPServer((args.host, args.port), MobileModelsHandler)
    print(f"Serving MobileModels on http://{args.host}:{args.port}")
    try:
        server.serve_forever()
    except KeyboardInterrupt:
        # Exit cleanly on Ctrl-C instead of leaking an unhandled traceback.
        server.server_close()
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
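The settings response above builds `effective_upstream_repo_url` via `get_effective_repo_url`, whose definition sits in the suppressed portion of this diff. A minimal sketch of what such a helper typically does with the `GITHUB_PROXY_PREFIX` value (the function body here is an assumption, not the committed implementation):

```python
def effective_repo_url(repo_url: str, proxy_prefix: str) -> str:
    """Prepend a GitHub accelerator prefix (e.g. "https://ghfast.top/")
    to a repo URL; an empty prefix means a direct connection."""
    prefix = proxy_prefix.strip()
    if not prefix:
        return repo_url
    # Accelerators of this style simply prepend their origin to the full URL.
    return prefix.rstrip("/") + "/" + repo_url
```

Passing an empty string, as the endpoint does when no proxy is configured, leaves the upstream URL untouched.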
File diff suppressed because it is too large
File diff suppressed because it is too large
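`main()` starts `scheduler_loop` on a daemon thread; the loop itself lives in one of the suppressed diffs above. Assuming it honors `SYNC_SCHEDULE_TIME` (format HH:MM, per the `.env` template), the core wait computation might look like this sketch (a hypothetical helper, not the committed code):

```python
import datetime


def seconds_until(hhmm: str, now: datetime.datetime) -> float:
    # Seconds from `now` until the next wall-clock occurrence of HH:MM;
    # if that time has already passed today, schedule it for tomorrow.
    hour, minute = map(int, hhmm.split(":"))
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)
    return (target - now).total_seconds()
```

A loop built on this would sleep for the returned interval, run the sync, and recompute; because the thread is a daemon, it never blocks server shutdown.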
@@ -0,0 +1,181 @@
<!doctype html>
<html lang="zh-CN">
<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>MobileModels 文档查看</title>
  <style>
    :root {
      --bg: #f5f7fb;
      --card: #ffffff;
      --text: #1c2430;
      --sub: #566173;
      --line: #d9e0ea;
      --brand: #0f6fff;
    }
    * { box-sizing: border-box; }
    body {
      margin: 0;
      font-family: "PingFang SC", "Noto Sans SC", "Microsoft YaHei", sans-serif;
      background: radial-gradient(circle at 0 0, #eef4ff 0, var(--bg) 40%), var(--bg);
      color: var(--text);
    }
    .top-nav {
      background: linear-gradient(180deg, #1f2a3a, #1a2431);
      border-bottom: 1px solid rgba(255, 255, 255, 0.08);
    }
    .top-nav-inner {
      max-width: 1200px;
      margin: 0 auto;
      padding: 0 16px;
      height: 52px;
      display: flex;
      align-items: center;
      gap: 8px;
    }
    .top-nav .brand,
    .top-nav .item {
      color: #d6e3f7;
      text-decoration: none;
      font-size: 14px;
      padding: 6px 10px;
      border-radius: 6px;
    }
    .top-nav .brand {
      font-weight: 700;
      margin-right: 8px;
      color: #f4f8ff;
    }
    .top-nav .item.active {
      background: rgba(255, 255, 255, 0.16);
      color: #ffffff;
      font-weight: 600;
    }
    .wrap {
      max-width: 1200px;
      margin: 24px auto;
      padding: 0 16px 32px;
      display: grid;
      gap: 16px;
    }
    .card {
      background: var(--card);
      border: 1px solid var(--line);
      border-radius: 14px;
      padding: 14px;
      box-shadow: 0 6px 18px rgba(36, 56, 89, 0.06);
    }
    .title {
      margin: 0 0 8px;
      font-size: 18px;
      font-weight: 700;
    }
    .sub {
      margin: 0 0 14px;
      color: var(--sub);
      font-size: 13px;
      line-height: 1.5;
    }
    .btns {
      display: flex;
      gap: 8px;
      flex-wrap: wrap;
      margin-top: 12px;
    }
    .btn {
      display: inline-flex;
      align-items: center;
      padding: 9px 14px;
      border-radius: 10px;
      border: 1px solid #c8d6ee;
      background: #f7faff;
      color: #244775;
      text-decoration: none;
      font-size: 13px;
      font-weight: 700;
    }
    .btn:hover {
      background: #eef5ff;
    }
    pre {
      margin: 0;
      white-space: pre-wrap;
      word-break: break-word;
      font-size: 13px;
      line-height: 1.65;
      background: #f6f8fb;
      border: 1px solid var(--line);
      border-radius: 10px;
      padding: 14px;
      overflow: auto;
    }
  </style>
</head>
<body>
  <nav class="top-nav">
    <div class="top-nav-inner">
      <a href="/web/device_query.html" class="brand">MobileModels</a>
      <a href="/web/device_query.html" class="item">设备查询</a>
      <a href="/web/brand_management.html" class="item">数据管理</a>
      <a href="/web/device_query.html?view=docs" class="item active">相关文档</a>
    </div>
  </nav>

  <div class="wrap">
    <section class="card">
      <h1 class="title" id="docTitle">文档查看</h1>
      <p class="sub" id="docPath">正在加载文档...</p>
      <div class="btns">
        <a class="btn" href="/web/doc_viewer.html?path=/docs/mysql-query-design.md&title=MySQL%20%E8%AE%BE%E8%AE%A1%E8%AF%B4%E6%98%8E">MySQL 设计说明</a>
        <a class="btn" href="/web/doc_viewer.html?path=/docs/web-ui.md&title=Web%20%E4%BD%BF%E7%94%A8%E8%AF%B4%E6%98%8E">Web 使用说明</a>
        <a class="btn" href="/web/doc_viewer.html?path=/README.md&title=%E9%A1%B9%E7%9B%AE%20README">项目 README</a>
      </div>
    </section>

    <section class="card">
      <pre id="docContent">加载中...</pre>
    </section>
  </div>

  <script>
    const ALLOWED_DOCS = new Map([
      ["/docs/mysql-query-design.md", "MySQL 设计说明"],
      ["/docs/web-ui.md", "Web 使用说明"],
      ["/README.md", "项目 README"],
    ]);

    async function main() {
      const params = new URLSearchParams(window.location.search);
      const path = params.get("path") || "/docs/mysql-query-design.md";
      const title = params.get("title") || ALLOWED_DOCS.get(path) || "文档查看";
      const docTitleEl = document.getElementById("docTitle");
      const docPathEl = document.getElementById("docPath");
      const docContentEl = document.getElementById("docContent");

      if (!ALLOWED_DOCS.has(path)) {
        docTitleEl.textContent = "文档不存在";
        docPathEl.textContent = path;
        docContentEl.textContent = "当前只允许查看预设文档。";
        return;
      }

      document.title = `${title} - MobileModels`;
      docTitleEl.textContent = title;
      docPathEl.textContent = path;

      try {
        const resp = await fetch(path, { cache: "no-store" });
        if (!resp.ok) {
          throw new Error(`HTTP ${resp.status}`);
        }
        const text = await resp.text();
        docContentEl.textContent = text;
      } catch (err) {
        docContentEl.textContent = `加载失败\n${err.message || err}`;
      }
    }

    main();
  </script>
</body>
</html>
@@ -1,4 +1,8 @@
# Changelog

### 2026-03-17

- `oppo_cn` Add OPPO Find N6.

### 2026-03-16

- `xiaomi-wear` Add Xiaomi Watch S5.

### 2026-03-12

- `xiaomi` Add POCO C85x 5G.

### 2026-03-11
@@ -1,4 +1,6 @@
# CHANGELOG

### 2026-03-17

- `oppo_global_en` Add OPPO Find N6.

### 2026-03-12

- `xiaomi_en` Add POCO C85x 5G.

### 2026-03-08
@@ -130,6 +130,12 @@

`PKH120`: OPPO Find N5 satellite communication edition

**OPPO Find N6:**

`PLP110`: OPPO Find N6

`PLP120`: OPPO Find N6 satellite communication edition

## Reno series

**OPPO Reno:**
@@ -93,6 +93,10 @@

`CPH2671`: OPPO Find N5

**OPPO Find N6:**

`CPH2765`: OPPO Find N6

## Reno series

**OPPO Reno:**

@@ -337,9 +341,9 @@

`CPH2811`: OPPO Reno15 Pro 5G / OPPO Reno15 Pro Max 5G

**OPPO Reno15 F / OPPO Reno15 FS / OPPO Reno15c / OPPO Reno15 A:**

`CPH2801`: OPPO Reno15 F 5G / OPPO Reno15 FS 5G / OPPO Reno15c 5G / OPPO Reno15 A

## F series
@@ -5,7 +5,7 @@

> Providing your device's codename is welcome. You can find it in the system firmware version, starting with `PD`.
>
> Please report any errors through your project delivery channel.

## vivo X series
@@ -201,6 +201,12 @@

`M2412W1`: Xiaomi wrist blood pressure monitor (Xiaomi Watch H1 E)

**[`P62`] Xiaomi Watch S5:**

`M2530W1`: Xiaomi Watch S5 Bluetooth version

`M2517W1`: Xiaomi Watch S5 eSIM version

## Xiaomi smart glasses

**[`O95`] Xiaomi AI Glasses:**