fix(proxy): resolve merge conflict in claude adapter — combine smart URL path detection with beta=true

Merge the feature sets from feat/smart-url-path-detection and main:
- Smart URL path detection: skip appending endpoint when base_url already ends with API path
- URL suffix preservation: preserve query/fragment from base_url via split_url_suffix
- v1 dedup: boundary-safe deduplication of /v1/v1
- ?beta=true: auto-append for /v1/messages endpoints (DuckCoding compatibility)
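
Taken together, the rules above can be sketched as a single function (an illustrative simplification, not the actual adapter code; `build_url` here is a stand-in):

```rust
// Simplified sketch of the merged URL rules (illustrative, not the real adapter).
fn build_url(base_url: &str, endpoint: &str) -> String {
    // Preserve any query/fragment suffix from the base URL so it can be re-appended.
    let (base, suffix) = match base_url.find(|c| c == '?' || c == '#') {
        Some(i) => (&base_url[..i], &base_url[i..]),
        None => (base_url, ""),
    };
    let base = base.trim_end_matches('/');
    // Smart path detection: skip appending when the base already ends with the endpoint.
    let mut url = if base.ends_with(endpoint) {
        base.to_string()
    } else {
        format!("{base}{endpoint}")
    };
    // Boundary-safe /v1/v1 dedup.
    url = url.replace("/v1/v1/", "/v1/");
    url.push_str(suffix);
    // Auto-append ?beta=true for native /v1/messages endpoints without a query string.
    if endpoint.contains("/v1/messages") && !url.contains('?') {
        url.push_str("?beta=true");
    }
    url
}
```

The real implementation splits these steps across `split_url_suffix` and `dedup_v1_v1_boundary_safe`; the sketch only shows how the pieces compose.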

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Committed by YoVinchen on 2026-02-20 18:09:05 +08:00
32 changed files with 1526 additions and 296 deletions

View File

@@ -61,8 +61,13 @@ Claude Code / Codex / Gemini official channels at 38% / 2% / 9% of original pric
</tr>
<tr>
<td width="180"><a href="https://crazyrouter.com/register?ref=cc-switch"><img src="assets/partners/logos/crazyrouter.jpg" alt="AICoding" width="150"></a></td>
<td>Thanks to Crazyrouter for sponsoring this project! Crazyrouter is a high-performance AI API aggregation platform — one API key for 300+ models including Claude Code, Codex, Gemini CLI, GPT, and more. All through a single OpenAI-compatible endpoint with zero code changes. Features include auto-failover, smart routing, unlimited concurrency, and global low-latency access. <a href="https://crazyrouter.com/register?ref=cc-switch">Register here</a> to get started.</td>
<td width="180"><a href="https://crazyrouter.com/register?aff=OZcm&ref=cc-switch"><img src="assets/partners/logos/crazyrouter.jpg" alt="AICoding" width="150"></a></td>
<td>Thanks to Crazyrouter for sponsoring this project! Crazyrouter is a high-performance AI API aggregation platform — one API key for 300+ models including Claude Code, Codex, Gemini CLI, and more. All models at 55% of official pricing with auto-failover, smart routing, and unlimited concurrency. Crazyrouter offers an exclusive deal for CC Switch users: register via <a href="https://crazyrouter.com/register?aff=OZcm&ref=cc-switch">this link</a> to get <strong>$2 free credit</strong> instantly, plus enter promo code `CCSWITCH` on your first top-up for an extra <strong>30% bonus credit</strong>! </td>
</tr>
<tr>
<td width="180"><a href="https://www.sssaicode.com/register?ref=DCP0SM"><img src="assets/partners/logos/sssaicode.png" alt="SSSAiCode" width="150"></a></td>
<td>Thanks to SSSAiCode for sponsoring this project! SSSAiCode is a stable and reliable API relay service dedicated to providing affordable Claude and Codex model services, <strong>offering cost-effective official Claude access at an equivalent of just ¥0.5/$</strong>, with monthly and pay-as-you-go billing plans and same-day fast invoicing. SSSAiCode offers a special deal for CC Switch users: register via <a href="https://www.sssaicode.com/register?ref=DCP0SM">this link</a> to enjoy $10 extra credit on every top-up!</td>
</tr>
</table>

View File

@@ -61,8 +61,13 @@ Claude Code / Codex / Gemini official channels from as low as 38% /
</tr>
<tr>
<td width="180"><a href="https://crazyrouter.com/register?ref=cc-switch"><img src="assets/partners/logos/crazyrouter.jpg" alt="AICoding" width="150"></a></td>
<td>Thanks to Crazyrouter for their support! Crazyrouter is a high-performance AI API aggregation platform. One API key gives access to 300+ models including Claude Code, Codex, Gemini CLI, and GPT. No code changes are needed thanks to the OpenAI-compatible endpoint. Supports auto-failover, smart routing, unlimited concurrency, and global low-latency access. <a href="https://crazyrouter.com/register?ref=cc-switch">Register here</a> to get started right away!</td>
<td width="180"><a href="https://crazyrouter.com/register?aff=OZcm&ref=cc-switch"><img src="assets/partners/logos/crazyrouter.jpg" alt="AICoding" width="150"></a></td>
<td>Thanks to Crazyrouter for their support! Crazyrouter is a high-performance AI API aggregation platform. One API key gives access to 300+ models including Claude Code, Codex, and Gemini CLI. All models are available at 55% of official pricing, with auto-failover, smart routing, and unlimited concurrency. Exclusive deal for CC Switch users: register via <a href="https://crazyrouter.com/register?aff=OZcm&ref=cc-switch">this link</a> to instantly receive <strong>$2 in free credit</strong>, and enter promo code `CCSWITCH` on your first top-up for an extra <strong>30% bonus credit</strong>!</td>
</tr>
<tr>
<td width="180"><a href="https://www.sssaicode.com/register?ref=DCP0SM"><img src="assets/partners/logos/sssaicode.png" alt="SSSAiCode" width="150"></a></td>
<td>Thanks to SSSAiCode for their support! SSSAiCode is a stable and reliable API relay service providing dependable, affordable Claude and Codex model services. <strong>Offers cost-effective official Claude service at an equivalent of ¥0.5/$</strong>, supports monthly and pay-as-you-go billing, and provides same-day fast invoicing. Special deal for CC Switch users: register via <a href="https://www.sssaicode.com/register?ref=DCP0SM">this link</a> to receive a $10 bonus on every top-up!</td>
</tr>
</table>

View File

@@ -62,8 +62,13 @@ Claude Code / Codex / Gemini official channels as low as 38% / 2% / 9% of the original price; top-ups are even more
</tr>
<tr>
<td width="180"><a href="https://crazyrouter.com/register?ref=cc-switch"><img src="assets/partners/logos/crazyrouter.jpg" alt="AICoding" width="150"></a></td>
<td>Thanks to Crazyrouter for sponsoring this project! Crazyrouter is a high-performance AI API aggregation platform: one API key gives access to 300+ models, including Claude Code, Codex, Gemini CLI, GPT, and more. Zero-code integration through a single OpenAI-compatible endpoint. Supports auto-failover, smart routing, unlimited concurrency, and global low-latency access. <a href="https://crazyrouter.com/register?ref=cc-switch">Register here</a> to get started!</td>
<td width="180"><a href="https://crazyrouter.com/register?aff=OZcm&ref=cc-switch"><img src="assets/partners/logos/crazyrouter.jpg" alt="AICoding" width="150"></a></td>
<td>Thanks to Crazyrouter for sponsoring this project! Crazyrouter is a high-performance AI API aggregation platform: one API key gives access to 300+ models, including Claude Code, Codex, Gemini CLI, and more. All models at as low as 55% of official pricing, with auto-failover, smart routing, and unlimited concurrency. Crazyrouter offers an exclusive deal for CC Switch users: register via <a href="https://crazyrouter.com/register?aff=OZcm&ref=cc-switch">this link</a> to get <strong>$2 in free credit</strong>, and enter promo code `CCSWITCH` on your first top-up for an extra <strong>30% bonus credit</strong>!</td>
</tr>
<tr>
<td width="180"><a href="https://www.sssaicode.com/register?ref=DCP0SM"><img src="assets/partners/logos/sssaicode.png" alt="SSSAiCode" width="150"></a></td>
<td>Thanks to SSSAiCode for sponsoring this project! SSSAiCode is a stable and reliable API relay service dedicated to providing stable, reliable, and affordable Claude and Codex model services. <strong>Offers cost-effective official Claude service at an equivalent of ¥0.5/$</strong>, supports monthly and pay-as-you-go billing, and same-day fast invoicing. SSSAiCode offers a special deal for users of this software: register via <a href="https://www.sssaicode.com/register?ref=DCP0SM">this link</a> to enjoy a $10 bonus on every top-up!</td>
</tr>
</table>

Binary file not shown (image; size after change: 447 KiB).

View File

@@ -57,7 +57,7 @@ url = "2.5"
auto-launch = "0.5"
once_cell = "1.21.3"
base64 = "0.22"
rusqlite = { version = "0.31", features = ["bundled", "backup"] }
rusqlite = { version = "0.31", features = ["bundled", "backup", "hooks"] }
indexmap = { version = "2", features = ["serde"] }
rust_decimal = "1.33"
uuid = { version = "1.11", features = ["v4"] }

View File

@@ -129,6 +129,20 @@ impl SkillApps {
apps.set_enabled_for(app, true);
apps
}
/// Build the enabled state from a list of source labels
///
/// When a label matches AppType::as_str(), the corresponding app is enabled;
/// other labels (such as "agents", "cc-switch") are ignored.
pub fn from_labels(labels: &[String]) -> Self {
let mut apps = Self::default();
for label in labels {
if let Ok(app) = label.parse::<AppType>() {
apps.set_enabled_for(&app, true);
}
}
apps
}
}
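
The behavior of `from_labels` can be illustrated with a stripped-down sketch (the `AppType`/`SkillApps` definitions below are simplified stand-ins, not the real types):

```rust
use std::str::FromStr;

// Minimal sketch of the label-driven enable logic.
#[derive(Debug, Default, PartialEq)]
struct SkillApps {
    claude: bool,
    codex: bool,
}

enum AppType {
    Claude,
    Codex,
}

impl FromStr for AppType {
    type Err = ();
    fn from_str(s: &str) -> Result<Self, ()> {
        match s {
            "claude" => Ok(AppType::Claude),
            "codex" => Ok(AppType::Codex),
            // Labels such as "agents" or "cc-switch" fail to parse...
            _ => Err(()),
        }
    }
}

fn from_labels(labels: &[&str]) -> SkillApps {
    let mut apps = SkillApps::default();
    for label in labels {
        match label.parse::<AppType>() {
            Ok(AppType::Claude) => apps.claude = true,
            Ok(AppType::Codex) => apps.codex = true,
            // ...and are simply ignored here.
            Err(()) => {}
        }
    }
    apps
}
```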
/// Installed Skill (unified structure, v3.10.0+)
@@ -175,6 +189,8 @@ pub struct UnmanagedSkill {
pub description: Option<String>,
/// Application directories where this skill was found (e.g. ["claude", "codex"])
pub found_in: Vec<String>,
/// Discovery path (full path of the first match)
pub path: String,
}
/// MCP server definition (unified structure, v3.7.0)

View File

@@ -1,8 +1,6 @@
#![allow(non_snake_case)]
use serde_json::{Value, json};
use std::future::Future;
use std::sync::OnceLock;
use serde_json::{json, Value};
use tauri::State;
use crate::commands::sync_support::{
@@ -13,8 +11,9 @@ use crate::services::webdav_sync as webdav_sync_service;
use crate::settings::{self, WebDavSyncSettings};
use crate::store::AppState;
fn persist_sync_error(settings: &mut WebDavSyncSettings, error: &AppError) {
fn persist_sync_error(settings: &mut WebDavSyncSettings, error: &AppError, source: &str) {
settings.status.last_error = Some(error.to_string());
settings.status.last_error_source = Some(source.to_string());
let _ = settings::update_webdav_sync_status(settings.status.clone());
}
@@ -57,20 +56,16 @@ fn resolve_password_for_request(
incoming
}
#[cfg(test)]
fn webdav_sync_mutex() -> &'static tokio::sync::Mutex<()> {
static LOCK: OnceLock<tokio::sync::Mutex<()>> = OnceLock::new();
LOCK.get_or_init(|| tokio::sync::Mutex::new(()))
webdav_sync_service::sync_mutex()
}
async fn run_with_webdav_lock<T, Fut>(operation: Fut) -> Result<T, AppError>
where
Fut: Future<Output = Result<T, AppError>>,
Fut: std::future::Future<Output = Result<T, AppError>>,
{
let result = {
let _guard = webdav_sync_mutex().lock().await;
operation.await
};
result
webdav_sync_service::run_with_sync_lock(operation).await
}
fn map_sync_result<T, F>(result: Result<T, AppError>, on_error: F) -> Result<T, String>
@@ -112,7 +107,9 @@ pub async fn webdav_sync_upload(state: State<'_, AppState>) -> Result<Value, Str
let mut settings = require_enabled_webdav_settings()?;
let result = run_with_webdav_lock(webdav_sync_service::upload(&db, &mut settings)).await;
map_sync_result(result, |error| persist_sync_error(&mut settings, error))
map_sync_result(result, |error| {
persist_sync_error(&mut settings, error, "manual")
})
}
#[tauri::command]
@@ -120,10 +117,11 @@ pub async fn webdav_sync_download(state: State<'_, AppState>) -> Result<Value, S
let db = state.db.clone();
let db_for_sync = db.clone();
let mut settings = require_enabled_webdav_settings()?;
let _auto_sync_suppression = crate::services::webdav_auto_sync::AutoSyncSuppressionGuard::new();
let sync_result = run_with_webdav_lock(webdav_sync_service::download(&db, &mut settings)).await;
let mut result = map_sync_result(sync_result, |error| {
persist_sync_error(&mut settings, error)
persist_sync_error(&mut settings, error, "manual")
})?;
// Post-download sync is best-effort: snapshot restore has already succeeded.
@@ -179,8 +177,8 @@ mod tests {
use crate::error::AppError;
use crate::settings::{AppSettings, WebDavSyncSettings};
use serial_test::serial;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::time::Duration;
#[tokio::test]
@@ -287,6 +285,7 @@ mod tests {
persist_sync_error(
&mut current,
&crate::error::AppError::Config("boom".to_string()),
"manual",
);
let after = crate::settings::get_webdav_sync_settings().expect("read webdav settings");
@@ -304,6 +303,7 @@ mod tests {
.contains("boom"),
"status error should be updated"
);
assert_eq!(after.status.last_error_source.as_deref(), Some("manual"));
}
#[test]

View File

@@ -37,7 +37,7 @@ pub use dao::OmoGlobalConfig;
use crate::config::get_app_config_dir;
use crate::error::AppError;
use rusqlite::Connection;
use rusqlite::{hooks::Action, Connection};
use serde::Serialize;
use std::sync::Mutex;
@@ -76,6 +76,17 @@ pub struct Database {
pub(crate) conn: Mutex<Connection>,
}
fn register_db_change_hook(conn: &Connection) {
conn.update_hook(Some(
|action: Action, _database: &str, table: &str, _row_id: i64| match action {
Action::SQLITE_INSERT | Action::SQLITE_UPDATE | Action::SQLITE_DELETE => {
crate::services::webdav_auto_sync::notify_db_changed(table);
}
_ => {}
},
));
}
impl Database {
/// Initialize the database connection and create tables
///
@@ -93,6 +104,7 @@ impl Database {
// Enable foreign key constraints
conn.execute("PRAGMA foreign_keys = ON;", [])
.map_err(|e| AppError::Database(e.to_string()))?;
register_db_change_hook(&conn);
let db = Self {
conn: Mutex::new(conn),
@@ -111,6 +123,7 @@ impl Database {
// Enable foreign key constraints
conn.execute("PRAGMA foreign_keys = ON;", [])
.map_err(|e| AppError::Database(e.to_string()))?;
register_db_change_hook(&conn);
let db = Self {
conn: Mutex::new(conn),

View File

@@ -702,6 +702,10 @@ pub fn run() {
}
let _tray = tray_builder.build(app)?;
crate::services::webdav_auto_sync::start_worker(
app_state.db.clone(),
app.handle().clone(),
);
// Inject the same instance into global state to avoid inconsistencies from duplicate creation
app.manage(app_state);

View File

@@ -280,7 +280,22 @@ impl ProviderAdapter for ClaudeAdapter {
// Strip duplicated /v1/v1 (can occur when both base_url and endpoint carry the version)
let base = dedup_v1_v1_boundary_safe(base);
format!("{base}{suffix}")
let url = format!("{base}{suffix}");
// Append ?beta=true for Claude-native /v1/messages endpoints
// Some upstreams (e.g. DuckCoding) rely on this parameter to validate the request origin
// Note: do not add it for OpenAI Chat Completions (/v1/chat/completions)
// When apiFormat="openai_chat", requests are forwarded to /v1/chat/completions,
// but that endpoint is the OpenAI standard and does not support ?beta=true
if endpoint.contains("/v1/messages")
&& !endpoint.contains("/v1/chat/completions")
&& !endpoint.contains('?')
&& !url.contains('?')
{
format!("{url}?beta=true")
} else {
url
}
}
fn add_auth_headers(&self, request: RequestBuilder, auth: &AuthInfo) -> RequestBuilder {
@@ -493,14 +508,14 @@ mod tests {
fn test_build_url_anthropic() {
let adapter = ClaudeAdapter::new();
let url = adapter.build_url("https://api.anthropic.com", "/v1/messages");
assert_eq!(url, "https://api.anthropic.com/v1/messages");
assert_eq!(url, "https://api.anthropic.com/v1/messages?beta=true");
}
#[test]
fn test_build_url_openrouter() {
let adapter = ClaudeAdapter::new();
let url = adapter.build_url("https://openrouter.ai/api", "/v1/messages");
assert_eq!(url, "https://openrouter.ai/api/v1/messages");
assert_eq!(url, "https://openrouter.ai/api/v1/messages?beta=true");
}
#[test]
@@ -524,7 +539,7 @@ mod tests {
let adapter = ClaudeAdapter::new();
// base_url already contains the full path /v1/messages; nothing is appended
let url = adapter.build_url("https://example.com/api/v1/messages", "/v1/messages");
assert_eq!(url, "https://example.com/api/v1/messages");
assert_eq!(url, "https://example.com/api/v1/messages?beta=true");
}
#[test]
@@ -543,7 +558,7 @@ mod tests {
let adapter = ClaudeAdapter::new();
// base_url ends with /messages (no /v1 prefix)
let url = adapter.build_url("https://example.com/api/messages", "/v1/messages");
assert_eq!(url, "https://example.com/api/messages");
assert_eq!(url, "https://example.com/api/messages?beta=true");
// base_url ends with /chat/completions (no /v1 prefix)
let url2 = adapter.build_url(
@@ -566,7 +581,16 @@ mod tests {
// Another case: /v1 + /v1/messages
let url2 = adapter.build_url("https://api.example.com/v1", "/v1/messages");
assert_eq!(url2, "https://api.example.com/v1/messages");
assert_eq!(url2, "https://api.example.com/v1/messages?beta=true");
}
#[test]
fn test_build_url_no_beta_for_openai_chat_completions() {
let adapter = ClaudeAdapter::new();
// OpenAI Chat Completions endpoints do not get ?beta=true
// This endpoint is used by apiFormat="openai_chat" providers such as Nvidia
let url = adapter.build_url("https://integrate.api.nvidia.com", "/v1/chat/completions");
assert_eq!(url, "https://integrate.api.nvidia.com/v1/chat/completions");
}
#[test]

View File

@@ -11,6 +11,7 @@ pub mod speedtest;
pub mod stream_check;
pub mod usage_stats;
pub mod webdav;
pub mod webdav_auto_sync;
pub mod webdav_sync;
pub use config::ConfigService;

View File

@@ -10,7 +10,7 @@ use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use std::collections::{HashMap, HashSet};
use std::fs;
use std::path::{Path, PathBuf};
use std::path::{Component, Path, PathBuf};
use std::sync::Arc;
use tokio::time::timeout;
@@ -159,6 +159,154 @@ pub struct SkillMetadata {
pub description: Option<String>,
}
// ========== Parsing the ~/.agents/ lock file ==========
/// Structure of `~/.agents/.skill-lock.json`
#[derive(Deserialize)]
struct AgentsLockFile {
skills: HashMap<String, AgentsLockSkill>,
}
/// Information for a single skill in the lock file
#[derive(Deserialize)]
#[serde(rename_all = "camelCase")]
struct AgentsLockSkill {
source: Option<String>,
source_type: Option<String>,
source_url: Option<String>,
skill_path: Option<String>,
branch: Option<String>,
source_branch: Option<String>,
}
#[derive(Debug, Clone)]
struct LockRepoInfo {
owner: String,
repo: String,
skill_path: Option<String>,
branch: Option<String>,
}
fn normalize_optional_branch(branch: Option<String>) -> Option<String> {
branch.and_then(|b| {
let trimmed = b.trim();
if trimmed.is_empty() {
None
} else {
Some(trimmed.to_string())
}
})
}
fn parse_branch_from_source_url(source_url: Option<&str>) -> Option<String> {
let source_url = source_url?;
let source_url = source_url.trim();
if source_url.is_empty() {
return None;
}
// Supports https://github.com/owner/repo/tree/<branch>/...
if let Some((_, after_tree)) = source_url.split_once("/tree/") {
let branch = after_tree
.split('/')
.next()
.map(str::trim)
.filter(|s| !s.is_empty())?;
return Some(branch.to_string());
}
// Supports URL fragments: ...git#branch
if let Some((_, fragment)) = source_url.split_once('#') {
let branch = fragment
.split('&')
.next()
.map(str::trim)
.filter(|s| !s.is_empty())?;
return Some(branch.to_string());
}
// Supports query strings: ...?branch=xxx / ?ref=xxx
if let Some((_, query)) = source_url.split_once('?') {
for pair in query.split('&') {
let Some((key, value)) = pair.split_once('=') else {
continue;
};
if matches!(key, "branch" | "ref") {
let branch = value.trim();
if !branch.is_empty() {
return Some(branch.to_string());
}
}
}
}
None
}
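
The three URL forms handled above can be condensed into a small sketch (illustrative only; `branch_from_url` is a hypothetical name, not the real function):

```rust
// Condensed sketch of extracting a branch name from a GitHub source URL,
// mirroring the three forms above: /tree/<branch>, #fragment, and ?branch=/?ref=.
fn branch_from_url(url: &str) -> Option<String> {
    // Form 1: https://github.com/owner/repo/tree/<branch>/...
    if let Some((_, rest)) = url.split_once("/tree/") {
        return rest.split('/').next().filter(|s| !s.is_empty()).map(str::to_string);
    }
    // Form 2: ...git#branch
    if let Some((_, frag)) = url.split_once('#') {
        return frag.split('&').next().filter(|s| !s.is_empty()).map(str::to_string);
    }
    // Form 3: ...?branch=xxx / ?ref=xxx
    if let Some((_, query)) = url.split_once('?') {
        for pair in query.split('&') {
            if let Some((k, v)) = pair.split_once('=') {
                if (k == "branch" || k == "ref") && !v.is_empty() {
                    return Some(v.to_string());
                }
            }
        }
    }
    None
}
```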
/// Return the `~/.agents/skills/` directory (only if it exists)
fn get_agents_skills_dir() -> Option<PathBuf> {
dirs::home_dir()
.map(|h| h.join(".agents").join("skills"))
.filter(|p| p.exists())
}
/// Parse `~/.agents/.skill-lock.json`, returning skill_name -> repository info
fn parse_agents_lock() -> HashMap<String, LockRepoInfo> {
let path = match dirs::home_dir() {
Some(h) => h.join(".agents").join(".skill-lock.json"),
None => {
log::warn!("Cannot determine the HOME directory; skipping agents lock file parsing");
return HashMap::new();
}
};
let content = match fs::read_to_string(&path) {
Ok(c) => c,
Err(e) => {
if e.kind() == std::io::ErrorKind::NotFound {
log::debug!("agents lock file not found: {}", path.display());
} else {
log::warn!("Failed to read agents lock file ({}): {}", path.display(), e);
}
return HashMap::new();
}
};
let lock: AgentsLockFile = match serde_json::from_str(&content) {
Ok(l) => l,
Err(e) => {
log::warn!("Failed to parse agents lock file ({}): {}", path.display(), e);
return HashMap::new();
}
};
let parsed: HashMap<String, LockRepoInfo> = lock
.skills
.into_iter()
.filter_map(|(name, skill)| {
let source = skill.source?;
if skill.source_type.as_deref() != Some("github") {
return None;
}
let (owner, repo) = source.split_once('/')?;
let branch = normalize_optional_branch(skill.branch)
.or_else(|| normalize_optional_branch(skill.source_branch))
.or_else(|| parse_branch_from_source_url(skill.source_url.as_deref()));
Some((
name,
LockRepoInfo {
owner: owner.to_string(),
repo: repo.to_string(),
skill_path: skill.skill_path,
branch,
},
))
})
.collect();
log::info!(
"agents lock file parsed; {} github skills identified",
parsed.len()
);
parsed
}
// ========== SkillService ==========
pub struct SkillService;
@@ -275,11 +423,25 @@ impl SkillService {
) -> Result<InstalledSkill> {
let ssot_dir = Self::get_ssot_dir()?;
// Use the last path segment as the install name
let install_name = Path::new(&skill.directory)
// Multi-level directories (e.g. a/b/c) are allowed, but the path must be a safe relative path.
let source_rel = Self::sanitize_skill_source_path(&skill.directory).ok_or_else(|| {
anyhow!(format_skill_error(
"INVALID_SKILL_DIRECTORY",
&[("directory", &skill.directory)],
Some("checkZipContent"),
))
})?;
// The install directory name always uses the last segment, avoiding nested directories in the SSOT.
let install_name = source_rel
.file_name()
.map(|s| s.to_string_lossy().to_string())
.unwrap_or_else(|| skill.directory.clone());
.and_then(|name| Self::sanitize_install_name(&name.to_string_lossy()))
.ok_or_else(|| {
anyhow!(format_skill_error(
"INVALID_SKILL_DIRECTORY",
&[("directory", &skill.directory)],
Some("checkZipContent"),
))
})?;
// Check whether the database already has a skill with the same directory (from another repo)
let existing_skills = db.get_all_installed_skills()?;
@@ -358,7 +520,7 @@ impl SkillService {
repo_branch = used_branch;
// Copy into the SSOT
let source = temp_dir.join(&skill.directory);
let source = temp_dir.join(&source_rel);
if !source.exists() {
let _ = fs::remove_dir_all(&temp_dir);
return Err(anyhow!(format_skill_error(
@@ -368,7 +530,24 @@ impl SkillService {
)));
}
Self::copy_dir_recursive(&source, &dest)?;
let canonical_temp = temp_dir.canonicalize().unwrap_or_else(|_| temp_dir.clone());
let canonical_source = source.canonicalize().map_err(|_| {
anyhow!(format_skill_error(
"SKILL_DIR_NOT_FOUND",
&[("path", &source.display().to_string())],
Some("checkRepoUrl"),
))
})?;
if !canonical_source.starts_with(&canonical_temp) || !canonical_source.is_dir() {
let _ = fs::remove_dir_all(&temp_dir);
return Err(anyhow!(format_skill_error(
"INVALID_SKILL_DIRECTORY",
&[("directory", &skill.directory)],
Some("checkZipContent"),
)));
}
Self::copy_dir_recursive(&canonical_source, &dest)?;
let _ = fs::remove_dir_all(&temp_dir);
// Use the branch that actually downloaded successfully, so readme_url / repo_branch match the real branch.
@@ -449,12 +628,7 @@ impl SkillService {
.ok_or_else(|| anyhow!("Skill not found: {id}"))?;
// Remove from every application directory
for app in [
AppType::Claude,
AppType::Codex,
AppType::Gemini,
AppType::OpenCode,
] {
for app in AppType::all() {
let _ = Self::remove_from_app(&skill.directory, &app);
}
@@ -511,74 +685,49 @@ impl SkillService {
.map(|s| s.directory.clone())
.collect();
// Collect all directories to scan, with their source labels
let mut scan_sources: Vec<(PathBuf, String)> = Vec::new();
for app in AppType::all() {
if let Ok(d) = Self::get_app_skills_dir(&app) {
scan_sources.push((d, app.as_str().to_string()));
}
}
if let Some(agents_dir) = get_agents_skills_dir() {
scan_sources.push((agents_dir, "agents".to_string()));
}
if let Ok(ssot_dir) = Self::get_ssot_dir() {
scan_sources.push((ssot_dir, "cc-switch".to_string()));
}
let mut unmanaged: HashMap<String, UnmanagedSkill> = HashMap::new();
for app in [
AppType::Claude,
AppType::Codex,
AppType::Gemini,
AppType::OpenCode,
] {
let app_dir = match Self::get_app_skills_dir(&app) {
Ok(d) => d,
for (scan_dir, label) in &scan_sources {
let entries = match fs::read_dir(scan_dir) {
Ok(e) => e,
Err(_) => continue,
};
if !app_dir.exists() {
continue;
}
for entry in fs::read_dir(&app_dir)? {
let entry = entry?;
for entry in entries.flatten() {
let path = entry.path();
if !path.is_dir() {
continue;
}
let dir_name = entry.file_name().to_string_lossy().to_string();
// Skip hidden directories (names starting with '.', e.g. .system)
if dir_name.starts_with('.') {
if dir_name.starts_with('.') || managed_dirs.contains(&dir_name) {
continue;
}
// Skip already-managed ones
if managed_dirs.contains(&dir_name) {
continue;
}
// Check for SKILL.md
let skill_md = path.join("SKILL.md");
let (name, description) = if skill_md.exists() {
match Self::parse_skill_metadata_static(&skill_md) {
Ok(meta) => (
meta.name.unwrap_or_else(|| dir_name.clone()),
meta.description,
),
Err(_) => (dir_name.clone(), None),
}
} else {
(dir_name.clone(), None)
};
// Add or update
let app_str = match app {
AppType::Claude => "claude",
AppType::Codex => "codex",
AppType::Gemini => "gemini",
AppType::OpenCode => "opencode",
AppType::OpenClaw => "openclaw",
};
let (name, description) = Self::read_skill_name_desc(&skill_md, &dir_name);
unmanaged
.entry(dir_name.clone())
.and_modify(|s| s.found_in.push(app_str.to_string()))
.and_modify(|s| s.found_in.push(label.clone()))
.or_insert(UnmanagedSkill {
directory: dir_name,
name,
description,
found_in: vec![app_str.to_string()],
found_in: vec![label.clone()],
path: path.display().to_string(),
});
}
}
@@ -594,34 +743,36 @@ impl SkillService {
directories: Vec<String>,
) -> Result<Vec<InstalledSkill>> {
let ssot_dir = Self::get_ssot_dir()?;
let agents_lock = parse_agents_lock();
let mut imported = Vec::new();
// Save repositories discovered in the lock file into skill_repos
save_repos_from_lock(db, &agents_lock, directories.iter().map(|s| s.as_str()));
// Collect all candidate search directories
let mut search_sources: Vec<(PathBuf, String)> = Vec::new();
for app in AppType::all() {
if let Ok(d) = Self::get_app_skills_dir(&app) {
search_sources.push((d, app.as_str().to_string()));
}
}
if let Some(agents_dir) = get_agents_skills_dir() {
search_sources.push((agents_dir, "agents".to_string()));
}
search_sources.push((ssot_dir.clone(), "cc-switch".to_string()));
for dir_name in directories {
// Find the source directory (copy from whichever app directory has it)
// Search every candidate directory
let mut source_path: Option<PathBuf> = None;
let mut found_in: Vec<String> = Vec::new();
for app in [
AppType::Claude,
AppType::Codex,
AppType::Gemini,
AppType::OpenCode,
] {
if let Ok(app_dir) = Self::get_app_skills_dir(&app) {
let skill_path = app_dir.join(&dir_name);
if skill_path.exists() {
if source_path.is_none() {
source_path = Some(skill_path);
}
let app_str = match app {
AppType::Claude => "claude",
AppType::Codex => "codex",
AppType::Gemini => "gemini",
AppType::OpenCode => "opencode",
AppType::OpenClaw => "openclaw",
};
found_in.push(app_str.to_string());
for (base, label) in &search_sources {
let skill_path = base.join(&dir_name);
if skill_path.exists() {
if source_path.is_none() {
source_path = Some(skill_path);
}
found_in.push(label.clone());
}
}
@@ -638,40 +789,25 @@ impl SkillService {
// Parse metadata
let skill_md = dest.join("SKILL.md");
let (name, description) = if skill_md.exists() {
match Self::parse_skill_metadata_static(&skill_md) {
Ok(meta) => (
meta.name.unwrap_or_else(|| dir_name.clone()),
meta.description,
),
Err(_) => (dir_name.clone(), None),
}
} else {
(dir_name.clone(), None)
};
let (name, description) = Self::read_skill_name_desc(&skill_md, &dir_name);
// Build the enabled state
let mut apps = SkillApps::default();
for app_str in &found_in {
match app_str.as_str() {
"claude" => apps.claude = true,
"codex" => apps.codex = true,
"gemini" => apps.gemini = true,
"opencode" => apps.opencode = true,
_ => {}
}
}
let apps = SkillApps::from_labels(&found_in);
// Extract repository info from the lock file
let (id, repo_owner, repo_name, repo_branch, readme_url) =
build_repo_info_from_lock(&agents_lock, &dir_name);
// Create the record
let skill = InstalledSkill {
id: format!("local:{dir_name}"),
id,
name,
description,
directory: dir_name,
repo_owner: None,
repo_name: None,
repo_branch: None,
readme_url: None,
repo_owner,
repo_name,
repo_branch,
readme_url,
apps,
installed_at: chrono::Utc::now().timestamp(),
};
@@ -1055,6 +1191,79 @@ impl SkillService {
Ok(meta)
}
/// Read name and description from SKILL.md, falling back to the directory name when absent
fn read_skill_name_desc(skill_md: &Path, fallback_name: &str) -> (String, Option<String>) {
if skill_md.exists() {
match Self::parse_skill_metadata_static(skill_md) {
Ok(meta) => (
meta.name.unwrap_or_else(|| fallback_name.to_string()),
meta.description,
),
Err(_) => (fallback_name.to_string(), None),
}
} else {
(fallback_name.to_string(), None)
}
}
/// Validate and normalize a skill source path (multi-level directories allowed), rejecting path traversal and absolute paths
fn sanitize_skill_source_path(raw: &str) -> Option<PathBuf> {
let trimmed = raw.trim();
if trimmed.is_empty() {
return None;
}
let mut normalized = PathBuf::new();
let mut has_component = false;
for component in Path::new(trimmed).components() {
match component {
Component::Normal(name) => {
let segment = name.to_string_lossy().trim().to_string();
if segment.is_empty() || segment == "." || segment == ".." {
return None;
}
normalized.push(segment);
has_component = true;
}
Component::CurDir
| Component::ParentDir
| Component::RootDir
| Component::Prefix(_) => {
return None;
}
}
}
has_component.then_some(normalized)
}
/// Validate and normalize an install directory name (the final on-disk name; a single segment only)
fn sanitize_install_name(raw: &str) -> Option<String> {
let trimmed = raw.trim();
if trimmed.is_empty() {
return None;
}
let path = Path::new(trimmed);
let mut components = path.components();
match (components.next(), components.next()) {
(Some(Component::Normal(name)), None) => {
let normalized = name.to_string_lossy().trim().to_string();
if normalized.is_empty()
|| normalized == "."
|| normalized == ".."
|| normalized.starts_with('.')
{
None
} else {
Some(normalized)
}
}
_ => None,
}
}
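
The component-by-component filtering shared by both sanitizers above can be sketched as a single helper (a simplified stand-in; `sanitize_rel_path` is a hypothetical name, not either of the actual methods):

```rust
use std::path::{Component, Path, PathBuf};

// Sketch of traversal-safe normalization: only plain Normal components are
// accepted; "..", "." prefixes, absolute roots, and drive prefixes are rejected.
fn sanitize_rel_path(raw: &str) -> Option<PathBuf> {
    let mut out = PathBuf::new();
    for component in Path::new(raw.trim()).components() {
        match component {
            Component::Normal(name) => out.push(name),
            // CurDir, ParentDir, RootDir, and Prefix are all rejected.
            _ => return None,
        }
    }
    // An empty result (e.g. whitespace-only input) is also rejected.
    (out.components().next().is_some()).then_some(out)
}
```

The install-name variant then additionally requires exactly one segment and no leading dot.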
/// Deduplicate the skill list (keyed on the full key, so same-named skills from different repos stay separate)
fn deduplicate_discoverable_skills(skills: &mut Vec<DiscoverableSkill>) {
let mut seen = HashMap::new();
@@ -1078,7 +1287,7 @@ impl SkillService {
let _ = temp_dir.keep();
let mut branches = Vec::new();
if !repo.branch.is_empty() {
if !repo.branch.is_empty() && !repo.branch.eq_ignore_ascii_case("HEAD") {
branches.push(repo.branch.as_str());
}
if !branches.contains(&"main") {
@@ -1143,9 +1352,12 @@ impl SkillService {
)));
};
// First pass: extract regular files and directories, and collect symlink entries
let mut symlinks: Vec<(PathBuf, String)> = Vec::new();
for i in 0..archive.len() {
let mut file = archive.by_index(i)?;
let file_path = file.name();
let file_path = file.name().to_string();
let relative_path =
if let Some(stripped) = file_path.strip_prefix(&format!("{root_name}/")) {
@@ -1160,7 +1372,12 @@ impl SkillService {
let outpath = dest.join(relative_path);
if file.is_dir() {
if file.is_symlink() {
// Read the symlink's target path
let mut target = String::new();
std::io::Read::read_to_string(&mut file, &mut target)?;
symlinks.push((outpath, target.trim().to_string()));
} else if file.is_dir() {
fs::create_dir_all(&outpath)?;
} else {
if let Some(parent) = outpath.parent() {
@@ -1171,6 +1388,9 @@ impl SkillService {
}
}
// Second pass: resolve symlinks by copying the target content into the symlink's location
Self::resolve_symlinks_in_dir(dest, &symlinks)?;
Ok(())
}
@@ -1193,6 +1413,58 @@ impl SkillService {
Ok(())
}
/// Resolve symbolic links from the ZIP: copy target content into the symlink's location
///
/// GitHub ZIP archives preserve symlink metadata, which `is_symlink()` detects during extraction.
/// This method resolves each symlink into actual file/directory content (instead of creating a
/// real symlink) so the result is cross-platform and the skill content stays self-contained.
fn resolve_symlinks_in_dir(base_dir: &Path, symlinks: &[(PathBuf, String)]) -> Result<()> {
// Canonicalize base_dir (/tmp → /private/tmp on macOS) so comparisons stay consistent
let canonical_base = base_dir
.canonicalize()
.unwrap_or_else(|_| base_dir.to_path_buf());
for (link_path, target) in symlinks {
// Take the symlink's parent directory, then join the target's relative path
let parent = link_path.parent().unwrap_or(base_dir);
let resolved = parent.join(target);
// Canonicalize the path (resolving .. and friends)
let resolved = match resolved.canonicalize() {
Ok(p) => p,
Err(_) => {
log::warn!(
"Symlink target does not exist, skipping: {} -> {}",
link_path.display(),
target
);
continue;
}
};
// Safety check: make sure the target stays inside base_dir (prevents path traversal)
if !resolved.starts_with(&canonical_base) {
log::warn!(
"Symlink target escapes the repository, skipping: {} -> {}",
link_path.display(),
resolved.display()
);
continue;
}
// Copy the target's content into the symlink's location
if resolved.is_dir() {
Self::copy_dir_recursive(&resolved, link_path)?;
} else if resolved.is_file() {
if let Some(parent) = link_path.parent() {
fs::create_dir_all(parent)?;
}
fs::copy(&resolved, link_path)?;
}
}
Ok(())
}
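
The canonicalize-then-`starts_with` containment rule used above can be isolated into a small helper (a sketch under the assumption that both paths exist on disk; `is_contained` is a hypothetical name):

```rust
use std::path::Path;

// Sketch of the containment check: canonicalize both sides, then require the
// resolved target to stay under the root.
fn is_contained(base: &Path, candidate: &Path) -> bool {
    let Ok(base) = base.canonicalize() else { return false };
    match candidate.canonicalize() {
        Ok(p) => p.starts_with(&base),
        // Dangling targets cannot be verified, so treat them as outside.
        Err(_) => false,
    }
}
```

Canonicalizing both sides first is what makes `..` components and platform symlinks (such as macOS's /tmp) compare correctly.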
// ========== Installing from a ZIP file ==========
/// Install Skills from a local ZIP file
@@ -1225,13 +1497,56 @@ impl SkillService {
let ssot_dir = Self::get_ssot_dir()?;
let mut installed = Vec::new();
let existing_skills = db.get_all_installed_skills()?;
let zip_stem = zip_path
.file_stem()
.and_then(|s| s.to_str())
.map(|s| s.to_string());
for skill_dir in skill_dirs {
// Parse metadata up front; it is used to determine the install name
let skill_md = skill_dir.join("SKILL.md");
let meta = if skill_md.exists() {
Self::parse_skill_metadata_static(&skill_md).ok()
} else {
None
};
// Use the directory name as the install name
let install_name = skill_dir
.file_name()
.map(|s| s.to_string_lossy().to_string())
.unwrap_or_else(|| "unknown".to_string());
// When SKILL.md sits at the ZIP root, skill_dir == temp_dir, and file_name()
// would return the temp directory name (e.g. .tmpDZKGpF), so fall back to other sources
let install_name = {
let dir_name = skill_dir
.file_name()
.map(|s| s.to_string_lossy().to_string())
.unwrap_or_default();
if skill_dir == temp_dir || dir_name.is_empty() || dir_name.starts_with('.') {
// SKILL.md at the root: prefer the metadata name, otherwise the ZIP file name
meta.as_ref()
.and_then(|m| m.name.as_deref())
.and_then(Self::sanitize_install_name)
.or_else(|| zip_stem.as_deref().and_then(Self::sanitize_install_name))
} else {
Self::sanitize_install_name(&dir_name)
.or_else(|| {
meta.as_ref()
.and_then(|m| m.name.as_deref())
.and_then(Self::sanitize_install_name)
})
.or_else(|| zip_stem.as_deref().and_then(Self::sanitize_install_name))
}
};
let install_name = match install_name {
Some(name) => name,
None => {
let _ = fs::remove_dir_all(&temp_dir);
return Err(anyhow!(format_skill_error(
"INVALID_SKILL_DIRECTORY",
&[("zip", &zip_path.display().to_string())],
Some("checkZipContent"),
)));
}
};
// Check for an existing skill with the same directory name
let conflict = existing_skills
@@ -1247,18 +1562,12 @@ impl SkillService {
continue;
}
// Parse metadata
let skill_md = skill_dir.join("SKILL.md");
let (name, description) = if skill_md.exists() {
match Self::parse_skill_metadata_static(&skill_md) {
Ok(meta) => (
meta.name.unwrap_or_else(|| install_name.clone()),
meta.description,
),
Err(_) => (install_name.clone(), None),
}
} else {
(install_name.clone(), None)
let (name, description) = match meta {
Some(m) => (
m.name.unwrap_or_else(|| install_name.clone()),
m.description,
),
None => (install_name.clone(), None),
};
// Copy into the SSOT
@@ -1322,6 +1631,8 @@ impl SkillService {
let temp_path = temp_dir.path().to_path_buf();
let _ = temp_dir.keep(); // Keep the directory, we'll clean up later
let mut symlinks: Vec<(PathBuf, String)> = Vec::new();
for i in 0..archive.len() {
let mut file = archive.by_index(i)?;
let file_path = match file.enclosed_name() {
@@ -1331,7 +1642,11 @@ impl SkillService {
let outpath = temp_path.join(&file_path);
if file.is_dir() {
if file.is_symlink() {
let mut target = String::new();
std::io::Read::read_to_string(&mut file, &mut target)?;
symlinks.push((outpath, target.trim().to_string()));
} else if file.is_dir() {
fs::create_dir_all(&outpath)?;
} else {
if let Some(parent) = outpath.parent() {
@@ -1342,6 +1657,9 @@ impl SkillService {
}
}
// Resolve symlinks
Self::resolve_symlinks_in_dir(&temp_path, &symlinks)?;
Ok(temp_path)
}
@@ -1414,38 +1732,109 @@ impl SkillService {
// ========== 迁移支持 ==========
/// Build the skill's ID, repo fields, and readme URL from lock file info
///
/// Returns (id, repo_owner, repo_name, repo_branch, readme_url)
fn build_repo_info_from_lock(
lock: &HashMap<String, LockRepoInfo>,
dir_name: &str,
) -> (
String,
Option<String>,
Option<String>,
Option<String>,
Option<String>,
) {
match lock.get(dir_name) {
Some(info) => {
let branch = info.branch.clone();
let url_branch = branch.clone().unwrap_or_else(|| "HEAD".to_string());
// Prefer skillPath from the lock file, else fall back to dir_name/SKILL.md
let fallback = format!("{dir_name}/SKILL.md");
let doc_path = info.skill_path.as_deref().unwrap_or(&fallback);
let url = Some(SkillService::build_skill_doc_url(
&info.owner,
&info.repo,
&url_branch,
doc_path,
));
(
format!("{}/{}:{dir_name}", info.owner, info.repo),
Some(info.owner.clone()),
Some(info.repo.clone()),
branch,
url,
)
}
None => (format!("local:{dir_name}"), None, None, None, None),
}
}
/// Save repos discovered in the lock file into skill_repos (deduplicated)
fn save_repos_from_lock(
db: &Arc<Database>,
lock: &HashMap<String, LockRepoInfo>,
directories: impl Iterator<Item = impl AsRef<str>>,
) {
let existing_repos: HashSet<(String, String)> = db
.get_skill_repos()
.unwrap_or_default()
.into_iter()
.map(|r| (r.owner, r.name))
.collect();
let mut added = HashSet::new();
for dir_name in directories {
if let Some(info) = lock.get(dir_name.as_ref()) {
let key = (info.owner.clone(), info.repo.clone());
if !existing_repos.contains(&key) && added.insert(key) {
let skill_repo = SkillRepo {
owner: info.owner.clone(),
name: info.repo.clone(),
// Use HEAD semantics when the branch is unknown; later downloads fall back to main/master.
branch: info.branch.clone().unwrap_or_else(|| "HEAD".to_string()),
enabled: true,
};
if let Err(e) = db.save_skill_repo(&skill_repo) {
log::warn!("Failed to save skill repo {}/{}: {}", info.owner, info.repo, e);
} else {
log::info!(
"Discovered and added repo from agents lock file: {}/{} ({})",
info.owner,
info.repo,
skill_repo.branch
);
}
}
}
}
}
/// First-launch migration: scan app directories and rebuild the database
pub fn migrate_skills_to_ssot(db: &Arc<Database>) -> Result<usize> {
let ssot_dir = SkillService::get_ssot_dir()?;
let agents_lock = parse_agents_lock();
let mut discovered: HashMap<String, SkillApps> = HashMap::new();
// Scan each app directory
for app in [
AppType::Claude,
AppType::Codex,
AppType::Gemini,
AppType::OpenCode,
] {
for app in AppType::all() {
let app_dir = match SkillService::get_app_skills_dir(&app) {
Ok(d) => d,
Err(_) => continue,
};
if !app_dir.exists() {
continue;
}
let entries = match fs::read_dir(&app_dir) {
Ok(e) => e,
Err(_) => continue,
};
for entry in fs::read_dir(&app_dir)? {
let entry = entry?;
for entry in entries.flatten() {
let path = entry.path();
if !path.is_dir() {
continue;
}
let dir_name = entry.file_name().to_string_lossy().to_string();
// Skip hidden directories (leading '.', e.g. .system)
if dir_name.starts_with('.') {
continue;
}
@@ -1456,7 +1845,6 @@ pub fn migrate_skills_to_ssot(db: &Arc<Database>) -> Result<usize> {
SkillService::copy_dir_recursive(&path, &ssot_path)?;
}
// Record enabled state
discovered
.entry(dir_name)
.or_default()
@@ -1467,32 +1855,28 @@ pub fn migrate_skills_to_ssot(db: &Arc<Database>) -> Result<usize> {
// Rebuild the database
db.clear_skills()?;
// Save repos discovered in the lock file into skill_repos
save_repos_from_lock(db, &agents_lock, discovered.keys());
let mut count = 0;
for (directory, apps) in discovered {
let ssot_path = ssot_dir.join(&directory);
let skill_md = ssot_path.join("SKILL.md");
let (name, description) = if skill_md.exists() {
match SkillService::parse_skill_metadata_static(&skill_md) {
Ok(meta) => (
meta.name.unwrap_or_else(|| directory.clone()),
meta.description,
),
Err(_) => (directory.clone(), None),
}
} else {
(directory.clone(), None)
};
let (name, description) = SkillService::read_skill_name_desc(&skill_md, &directory);
let (id, repo_owner, repo_name, repo_branch, readme_url) =
build_repo_info_from_lock(&agents_lock, &directory);
let skill = InstalledSkill {
id: format!("local:{directory}"),
id,
name,
description,
directory,
repo_owner: None,
repo_name: None,
repo_branch: None,
readme_url: None,
repo_owner,
repo_name,
repo_branch,
readme_url,
apps,
installed_at: chrono::Utc::now().timestamp(),
};

View File

@@ -8,6 +8,7 @@ use std::time::Duration;
use crate::error::AppError;
use crate::proxy::http_client;
use futures::StreamExt;
const DEFAULT_TIMEOUT_SECS: u64 = 30;
/// Timeout for large file transfers (PUT/GET of db.sql, skills.zip).
@@ -237,15 +238,7 @@ pub async fn put_bytes(
)
.send()
.await
.map_err(|e| {
webdav_transport_error(
"webdav.put_failed",
"PUT 请求",
"PUT request",
url,
&e,
)
})?;
.map_err(|e| webdav_transport_error("webdav.put_failed", "PUT 请求", "PUT request", url, &e))?;
if resp.status().is_success() {
return Ok(());
@@ -259,6 +252,7 @@ pub async fn put_bytes(
pub async fn get_bytes(
url: &str,
auth: &WebDavAuth,
max_bytes: usize,
) -> Result<Option<(Vec<u8>, Option<String>)>, AppError> {
let client = http_client::get();
let resp = apply_auth(
@@ -269,15 +263,7 @@ pub async fn get_bytes(
)
.send()
.await
.map_err(|e| {
webdav_transport_error(
"webdav.get_failed",
"GET 请求",
"GET request",
url,
&e,
)
})?;
.map_err(|e| webdav_transport_error("webdav.get_failed", "GET 请求", "GET request", url, &e))?;
if resp.status() == StatusCode::NOT_FOUND {
return Ok(None);
@@ -285,22 +271,29 @@ pub async fn get_bytes(
if !resp.status().is_success() {
return Err(webdav_status_error("GET", resp.status(), url));
}
ensure_content_length_within_limit(resp.headers(), max_bytes, url)?;
let etag = resp
.headers()
.get("etag")
.and_then(|v| v.to_str().ok())
.map(|s| s.to_string());
let bytes = resp
.bytes()
.await
.map_err(|e| {
let mut bytes = Vec::new();
let mut stream = resp.bytes_stream();
while let Some(chunk) = stream.next().await {
let chunk = chunk.map_err(|e| {
AppError::localized(
"webdav.response_read_failed",
format!("读取 WebDAV 响应失败: {e}"),
format!("Failed to read WebDAV response: {e}"),
)
})?;
Ok(Some((bytes.to_vec(), etag)))
if bytes.len().saturating_add(chunk.len()) > max_bytes {
return Err(response_too_large_error(url, max_bytes));
}
bytes.extend_from_slice(&chunk);
}
Ok(Some((bytes, etag)))
}
/// HEAD request to retrieve the ETag. Returns `None` on 404.
@@ -315,13 +308,7 @@ pub async fn head_etag(url: &str, auth: &WebDavAuth) -> Result<Option<String>, A
.send()
.await
.map_err(|e| {
webdav_transport_error(
"webdav.head_failed",
"HEAD 请求",
"HEAD request",
url,
&e,
)
webdav_transport_error("webdav.head_failed", "HEAD 请求", "HEAD request", url, &e)
})?;
if resp.status() == StatusCode::NOT_FOUND {
@@ -386,9 +373,7 @@ pub fn webdav_status_error(op: &str, status: StatusCode, url: &str) -> AppError
if matches!(status, StatusCode::UNAUTHORIZED | StatusCode::FORBIDDEN) {
if jgy {
zh.push_str(
"。坚果云请使用「第三方应用密码」,并确认地址指向 /dav/ 下的目录。",
);
zh.push_str("。坚果云请使用「第三方应用密码」,并确认地址指向 /dav/ 下的目录。");
en.push_str(
". For Jianguoyun, use an app-specific password and ensure the URL points under /dav/.",
);
@@ -401,9 +386,7 @@ pub fn webdav_status_error(op: &str, status: StatusCode, url: &str) -> AppError
en.push_str(". Common Jianguoyun cause: URL is outside a writable /dav/ directory.");
} else if op == "MKCOL" && status == StatusCode::CONFLICT {
if jgy {
zh.push_str(
"。坚果云不允许自动创建顶层文件夹,请先在网页端手动创建后重试。",
);
zh.push_str("。坚果云不允许自动创建顶层文件夹,请先在网页端手动创建后重试。");
en.push_str(
". Jianguoyun does not allow creating top-level folders automatically; create it manually first.",
);
@@ -446,9 +429,47 @@ fn redact_url(raw: &str) -> String {
}
}
fn response_too_large_error(url: &str, max_bytes: usize) -> AppError {
let max_mb = max_bytes / 1024 / 1024;
AppError::localized(
"webdav.response_too_large",
format!(
"WebDAV 响应体超过上限({} MB): {}",
max_mb,
redact_url(url)
),
format!(
"WebDAV response body exceeds limit ({} MB): {}",
max_mb,
redact_url(url)
),
)
}
fn ensure_content_length_within_limit(
headers: &reqwest::header::HeaderMap,
max_bytes: usize,
url: &str,
) -> Result<(), AppError> {
let Some(content_length) = headers.get(reqwest::header::CONTENT_LENGTH) else {
return Ok(());
};
let Ok(raw) = content_length.to_str() else {
return Ok(());
};
let Ok(value) = raw.parse::<u64>() else {
return Ok(());
};
if value > max_bytes as u64 {
return Err(response_too_large_error(url, max_bytes));
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
use reqwest::header::{HeaderMap, HeaderValue, CONTENT_LENGTH};
#[test]
fn build_remote_url_encodes_path_segments() {
@@ -498,10 +519,34 @@ mod tests {
#[test]
fn redact_url_hides_credentials_and_query_values() {
let redacted = redact_url("https://alice:secret@example.com:8443/dav?token=abc&foo=1");
assert_eq!(
redacted,
"https://example.com:8443/dav?[keys:foo,token]"
);
assert_eq!(redacted, "https://example.com:8443/dav?[keys:foo,token]");
assert!(!redacted.contains("secret"));
}
#[test]
fn ensure_content_length_within_limit_accepts_missing_or_small_values() {
let empty = HeaderMap::new();
assert!(
ensure_content_length_within_limit(&empty, 1024, "https://dav.example.com").is_ok()
);
let mut small = HeaderMap::new();
small.insert(CONTENT_LENGTH, HeaderValue::from_static("1024"));
assert!(
ensure_content_length_within_limit(&small, 1024, "https://dav.example.com").is_ok()
);
}
#[test]
fn ensure_content_length_within_limit_rejects_oversized_values() {
let mut large = HeaderMap::new();
large.insert(CONTENT_LENGTH, HeaderValue::from_static("2048"));
let err = ensure_content_length_within_limit(&large, 1024, "https://dav.example.com")
.expect_err("oversized response should be rejected");
assert!(
err.to_string().contains("too large") || err.to_string().contains("超过"),
"unexpected error: {err}"
);
}
}

View File

@@ -0,0 +1,277 @@
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;
use std::sync::OnceLock;
use std::time::{Duration, Instant};
use serde_json::json;
use tauri::{AppHandle, Emitter};
use tokio::sync::mpsc::error::TrySendError;
use tokio::sync::mpsc::{channel, Receiver, Sender};
use crate::error::AppError;
use crate::services::webdav_sync as webdav_sync_service;
use crate::settings::{self, WebDavSyncSettings};
const AUTO_SYNC_DEBOUNCE_MS: u64 = 1000;
pub(crate) const MAX_AUTO_SYNC_WAIT_MS: u64 = 10_000;
static DB_CHANGE_TX: OnceLock<Sender<String>> = OnceLock::new();
static AUTO_SYNC_SUPPRESS_DEPTH: AtomicUsize = AtomicUsize::new(0);
pub(crate) struct AutoSyncSuppressionGuard;
impl AutoSyncSuppressionGuard {
pub fn new() -> Self {
AUTO_SYNC_SUPPRESS_DEPTH.fetch_add(1, Ordering::SeqCst);
Self
}
}
impl Drop for AutoSyncSuppressionGuard {
fn drop(&mut self) {
let _ =
AUTO_SYNC_SUPPRESS_DEPTH.fetch_update(Ordering::SeqCst, Ordering::SeqCst, |value| {
Some(value.saturating_sub(1))
});
}
}
pub(crate) fn is_auto_sync_suppressed() -> bool {
AUTO_SYNC_SUPPRESS_DEPTH.load(Ordering::SeqCst) > 0
}
pub fn should_trigger_for_table(table: &str) -> bool {
let normalized = table.trim().to_ascii_lowercase();
matches!(
normalized.as_str(),
"providers"
| "provider_endpoints"
| "mcp_servers"
| "prompts"
| "skills"
| "skill_repos"
| "settings"
| "proxy_config"
)
}
pub(crate) fn enqueue_change_signal(tx: &Sender<String>, table: &str) -> bool {
match tx.try_send(table.to_string()) {
Ok(()) => true,
Err(TrySendError::Full(_)) | Err(TrySendError::Closed(_)) => false,
}
}
pub(crate) fn auto_sync_wait_duration(started_at: Instant, now: Instant) -> Option<Duration> {
let max_wait = Duration::from_millis(MAX_AUTO_SYNC_WAIT_MS);
let debounce = Duration::from_millis(AUTO_SYNC_DEBOUNCE_MS);
let elapsed = now.saturating_duration_since(started_at);
if elapsed >= max_wait {
return None;
}
Some(debounce.min(max_wait - elapsed))
}
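The debounce-with-cap calculation above can be exercised on its own. Below is a minimal standalone sketch; the constants mirror `AUTO_SYNC_DEBOUNCE_MS` (1 s) and `MAX_AUTO_SYNC_WAIT_MS` (10 s) from the diff:

```rust
use std::time::{Duration, Instant};

const DEBOUNCE: Duration = Duration::from_millis(1000);
const MAX_WAIT: Duration = Duration::from_millis(10_000);

// How long to keep waiting for further change events, or None once the
// total elapsed time hits the hard cap (forcing a flush).
fn wait_duration(started_at: Instant, now: Instant) -> Option<Duration> {
    let elapsed = now.saturating_duration_since(started_at);
    if elapsed >= MAX_WAIT {
        return None;
    }
    Some(DEBOUNCE.min(MAX_WAIT - elapsed))
}

fn main() {
    let start = Instant::now();
    // Early in the window: a full debounce interval is granted.
    assert_eq!(wait_duration(start, start), Some(DEBOUNCE));
    // Near the cap: only the remaining budget is granted.
    let near_cap = start + Duration::from_millis(9_500);
    assert_eq!(wait_duration(start, near_cap), Some(Duration::from_millis(500)));
    // Past the cap: stop waiting and flush.
    assert!(wait_duration(start, start + MAX_WAIT).is_none());
    println!("ok");
}
```

This is why a burst of rapid events coalesces into one upload, while a continuous stream can delay the flush by at most ten seconds.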
fn should_run_auto_sync(settings: Option<&WebDavSyncSettings>) -> bool {
let Some(sync) = settings else {
return false;
};
sync.enabled && sync.auto_sync
}
fn persist_auto_sync_error(settings: &mut WebDavSyncSettings, error: &AppError) {
settings.status.last_error = Some(error.to_string());
settings.status.last_error_source = Some("auto".to_string());
let _ = settings::update_webdav_sync_status(settings.status.clone());
}
fn emit_auto_sync_status_updated(app: &AppHandle, status: &str, error: Option<&str>) {
let payload = match error {
Some(message) => json!({
"source": "auto",
"status": status,
"error": message,
}),
None => json!({
"source": "auto",
"status": status,
}),
};
if let Err(err) = app.emit("webdav-sync-status-updated", payload) {
log::debug!("[WebDAV] failed to emit sync status update event: {err}");
}
}
async fn run_auto_sync_upload(
db: &crate::database::Database,
app: &AppHandle,
) -> Result<(), AppError> {
let mut settings = settings::get_webdav_sync_settings();
if !should_run_auto_sync(settings.as_ref()) {
return Ok(());
}
let mut sync_settings = match settings.take() {
Some(value) => value,
None => return Ok(()),
};
let result = webdav_sync_service::run_with_sync_lock(webdav_sync_service::upload(
db,
&mut sync_settings,
))
.await;
match result {
Ok(_) => {
emit_auto_sync_status_updated(app, "success", None);
Ok(())
}
Err(err) => {
persist_auto_sync_error(&mut sync_settings, &err);
emit_auto_sync_status_updated(app, "error", Some(&err.to_string()));
Err(err)
}
}
}
pub fn notify_db_changed(table: &str) {
if is_auto_sync_suppressed() {
return;
}
if !should_trigger_for_table(table) {
return;
}
let Some(tx) = DB_CHANGE_TX.get() else {
return;
};
let _ = enqueue_change_signal(tx, table);
}
pub fn start_worker(db: Arc<crate::database::Database>, app: tauri::AppHandle) {
if DB_CHANGE_TX.get().is_some() {
return;
}
// Buffer size 1 is enough: we only need "dirty" signals, not every event.
let (tx, rx) = channel::<String>(1);
if DB_CHANGE_TX.set(tx).is_err() {
return;
}
tauri::async_runtime::spawn(async move {
run_worker_loop(db, rx, app).await;
});
}
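The capacity-1 channel acts as a "dirty flag": the first signal is buffered, and any signal arriving before the worker drains it is simply dropped. A dependency-free sketch of the same `try_send` pattern, using std's `sync_channel` in place of tokio's mpsc:

```rust
use std::sync::mpsc::{sync_channel, SyncSender, TrySendError};

// Mirrors enqueue_change_signal: non-blocking send, reporting whether
// the signal was queued or coalesced/dropped.
fn enqueue(tx: &SyncSender<String>, table: &str) -> bool {
    match tx.try_send(table.to_string()) {
        Ok(()) => true,
        Err(TrySendError::Full(_)) | Err(TrySendError::Disconnected(_)) => false,
    }
}

fn main() {
    let (tx, rx) = sync_channel::<String>(1);
    assert!(enqueue(&tx, "providers"));  // fills the one-slot buffer
    assert!(!enqueue(&tx, "settings"));  // coalesced while buffer is full
    assert_eq!(rx.recv().ok().as_deref(), Some("providers"));
    assert!(enqueue(&tx, "settings"));   // buffer drained, accepted again
    println!("ok");
}
```

Note std's error variant is `Disconnected` where tokio's is `Closed`; the semantics are otherwise equivalent for this pattern.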
async fn run_worker_loop(
db: Arc<crate::database::Database>,
mut rx: Receiver<String>,
app: tauri::AppHandle,
) {
while let Some(first_table) = rx.recv().await {
let started_at = Instant::now();
let mut merged_count = 1usize;
loop {
let Some(wait_for) = auto_sync_wait_duration(started_at, Instant::now()) else {
break;
};
let timeout = tokio::time::timeout(wait_for, rx.recv()).await;
match timeout {
Ok(Some(_)) => merged_count += 1,
Ok(None) => return,
Err(_) => break,
}
}
log::debug!(
"[WebDAV][AutoSync] Triggered by table={first_table}, merged_changes={merged_count}"
);
if let Err(err) = run_auto_sync_upload(&db, &app).await {
log::warn!("[WebDAV][AutoSync] Upload failed: {err}");
}
}
}
#[cfg(test)]
mod tests {
use super::{
auto_sync_wait_duration, enqueue_change_signal, is_auto_sync_suppressed,
should_run_auto_sync, should_trigger_for_table, AutoSyncSuppressionGuard,
MAX_AUTO_SYNC_WAIT_MS,
};
use crate::settings::WebDavSyncSettings;
use std::time::{Duration, Instant};
use tokio::sync::mpsc::channel;
#[test]
fn should_trigger_sync_for_config_tables_only() {
assert!(should_trigger_for_table("providers"));
assert!(should_trigger_for_table("settings"));
assert!(!should_trigger_for_table("proxy_request_logs"));
assert!(!should_trigger_for_table("provider_health"));
}
#[test]
fn suppression_guard_enables_and_restores_state() {
assert!(!is_auto_sync_suppressed());
{
let _guard = AutoSyncSuppressionGuard::new();
assert!(is_auto_sync_suppressed());
}
assert!(!is_auto_sync_suppressed());
}
#[test]
fn max_wait_caps_flush_latency_for_continuous_events() {
let started = Instant::now();
let later = started + Duration::from_millis(MAX_AUTO_SYNC_WAIT_MS + 1);
assert!(auto_sync_wait_duration(started, later).is_none());
}
#[tokio::test]
async fn enqueue_change_signal_drops_when_channel_is_full() {
let (tx, _rx) = channel::<String>(1);
assert!(enqueue_change_signal(&tx, "providers"));
assert!(!enqueue_change_signal(&tx, "providers"));
}
#[test]
fn should_run_auto_sync_requires_enabled_and_auto_sync_flag() {
assert!(!should_run_auto_sync(None));
let disabled = WebDavSyncSettings {
enabled: false,
auto_sync: true,
..WebDavSyncSettings::default()
};
assert!(!should_run_auto_sync(Some(&disabled)));
let auto_sync_off = WebDavSyncSettings {
enabled: true,
auto_sync: false,
..WebDavSyncSettings::default()
};
assert!(!should_run_auto_sync(Some(&auto_sync_off)));
let enabled = WebDavSyncSettings {
enabled: true,
auto_sync: true,
..WebDavSyncSettings::default()
};
assert!(should_run_auto_sync(Some(&enabled)));
}
#[test]
fn service_layer_does_not_depend_on_commands_layer() {
let source = include_str!("webdav_auto_sync.rs");
let needle = ["crate", "commands", ""].join("::");
assert!(
!source.contains(&needle),
"services layer should not depend on commands layer"
);
}
}

View File

@@ -5,7 +5,9 @@
use std::collections::BTreeMap;
use std::fs;
use std::future::Future;
use std::process::Command;
use std::sync::OnceLock;
use chrono::Utc;
use serde::{Deserialize, Serialize};
@@ -33,6 +35,21 @@ const REMOTE_DB_SQL: &str = "db.sql";
const REMOTE_SKILLS_ZIP: &str = "skills.zip";
const REMOTE_MANIFEST: &str = "manifest.json";
const MAX_DEVICE_NAME_LEN: usize = 64;
const MAX_MANIFEST_BYTES: usize = 1024 * 1024;
pub(super) const MAX_SYNC_ARTIFACT_BYTES: u64 = 512 * 1024 * 1024;
pub fn sync_mutex() -> &'static tokio::sync::Mutex<()> {
static LOCK: OnceLock<tokio::sync::Mutex<()>> = OnceLock::new();
LOCK.get_or_init(|| tokio::sync::Mutex::new(()))
}
pub async fn run_with_sync_lock<T, Fut>(operation: Fut) -> Result<T, AppError>
where
Fut: Future<Output = Result<T, AppError>>,
{
let _guard = sync_mutex().lock().await;
operation.await
}
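The `OnceLock`-backed global mutex pattern above can be sketched with std's synchronous `Mutex` (the diff uses tokio's async `Mutex`, but the lazy-static-lock shape is the same):

```rust
use std::sync::{Mutex, OnceLock};

// Lazily-initialized global lock, mirroring sync_mutex() in the diff.
fn sync_mutex() -> &'static Mutex<()> {
    static LOCK: OnceLock<Mutex<()>> = OnceLock::new();
    LOCK.get_or_init(|| Mutex::new(()))
}

// Serialize an operation against all other holders of the same lock.
fn run_with_sync_lock<T>(operation: impl FnOnce() -> T) -> T {
    let _guard = sync_mutex().lock().expect("lock poisoned");
    operation()
}

fn main() {
    // Sequential calls serialize cleanly; a reentrant call would deadlock,
    // which is why callers must not nest run_with_sync_lock.
    let a = run_with_sync_lock(|| 1 + 1);
    let b = run_with_sync_lock(|| a + 1);
    assert_eq!((a, b), (2, 3));
    println!("ok");
}
```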
fn localized(key: &'static str, zh: impl Into<String>, en: impl Into<String>) -> AppError {
AppError::localized(key, zh, en)
@@ -145,13 +162,15 @@ pub async fn download(
let auth = auth_for(settings);
let manifest_url = remote_file_url(settings, REMOTE_MANIFEST)?;
let (manifest_bytes, etag) = get_bytes(&manifest_url, &auth).await?.ok_or_else(|| {
localized(
"webdav.sync.remote_empty",
"远端没有可下载的同步数据",
"No downloadable sync data found on the remote.",
)
})?;
let (manifest_bytes, etag) = get_bytes(&manifest_url, &auth, MAX_MANIFEST_BYTES)
.await?
.ok_or_else(|| {
localized(
"webdav.sync.remote_empty",
"远端没有可下载的同步数据",
"No downloadable sync data found on the remote.",
)
})?;
let manifest: SyncManifest =
serde_json::from_slice(&manifest_bytes).map_err(|e| AppError::Json {
@@ -181,7 +200,7 @@ pub async fn fetch_remote_info(settings: &WebDavSyncSettings) -> Result<Option<V
let auth = auth_for(settings);
let manifest_url = remote_file_url(settings, REMOTE_MANIFEST)?;
let Some((bytes, _)) = get_bytes(&manifest_url, &auth).await? else {
let Some((bytes, _)) = get_bytes(&manifest_url, &auth, MAX_MANIFEST_BYTES).await? else {
return Ok(None);
};
@@ -214,6 +233,7 @@ fn persist_sync_success(
let status = WebDavSyncStatus {
last_sync_at: Some(Utc::now().timestamp()),
last_error: None,
last_error_source: None,
last_local_manifest_hash: Some(manifest_hash.clone()),
last_remote_manifest_hash: Some(manifest_hash),
last_remote_etag: etag,
@@ -319,14 +339,10 @@ fn sha256_hex(bytes: &[u8]) -> String {
}
fn detect_system_device_name() -> Option<String> {
let env_name = [
"CC_SWITCH_DEVICE_NAME",
"COMPUTERNAME",
"HOSTNAME",
]
.iter()
.filter_map(|key| std::env::var(key).ok())
.find_map(|value| normalize_device_name(&value));
let env_name = ["CC_SWITCH_DEVICE_NAME", "COMPUTERNAME", "HOSTNAME"]
.iter()
.filter_map(|key| std::env::var(key).ok())
.find_map(|value| normalize_device_name(&value));
if env_name.is_some() {
return env_name;
@@ -341,21 +357,26 @@ fn detect_system_device_name() -> Option<String> {
}
fn normalize_device_name(raw: &str) -> Option<String> {
let compact = raw.chars().fold(String::with_capacity(raw.len()), |mut acc, ch| {
if ch.is_whitespace() {
acc.push(' ');
} else if !ch.is_control() {
acc.push(ch);
}
acc
});
let compact = raw
.chars()
.fold(String::with_capacity(raw.len()), |mut acc, ch| {
if ch.is_whitespace() {
acc.push(' ');
} else if !ch.is_control() {
acc.push(ch);
}
acc
});
let normalized = compact.split_whitespace().collect::<Vec<_>>().join(" ");
let trimmed = normalized.trim();
if trimmed.is_empty() {
return None;
}
let limited = trimmed.chars().take(MAX_DEVICE_NAME_LEN).collect::<String>();
let limited = trimmed
.chars()
.take(MAX_DEVICE_NAME_LEN)
.collect::<String>();
if limited.is_empty() {
None
} else {
@@ -405,14 +426,18 @@ async fn download_and_verify(
format!("Manifest missing artifact: {artifact_name}"),
)
})?;
validate_artifact_size_limit(artifact_name, meta.size)?;
let url = remote_file_url(settings, artifact_name)?;
let (bytes, _) = get_bytes(&url, auth).await?.ok_or_else(|| {
localized(
"webdav.sync.remote_missing_artifact",
format!("远端缺少 artifact 文件: {artifact_name}"),
format!("Remote artifact file missing: {artifact_name}"),
)
})?;
let (bytes, _) = get_bytes(&url, auth, MAX_SYNC_ARTIFACT_BYTES as usize)
.await?
.ok_or_else(|| {
localized(
"webdav.sync.remote_missing_artifact",
format!("远端缺少 artifact 文件: {artifact_name}"),
format!("Remote artifact file missing: {artifact_name}"),
)
})?;
// Quick size check before expensive hash
if bytes.len() as u64 != meta.size {
@@ -503,6 +528,21 @@ fn auth_for(settings: &WebDavSyncSettings) -> WebDavAuth {
auth_from_credentials(&settings.username, &settings.password)
}
fn validate_artifact_size_limit(artifact_name: &str, size: u64) -> Result<(), AppError> {
if size > MAX_SYNC_ARTIFACT_BYTES {
let max_mb = MAX_SYNC_ARTIFACT_BYTES / 1024 / 1024;
return Err(localized(
"webdav.sync.artifact_too_large",
format!("artifact {artifact_name} 超过下载上限({} MB)", max_mb),
format!(
"Artifact {artifact_name} exceeds download limit ({} MB)",
max_mb
),
));
}
Ok(())
}
// ─── Tests ───────────────────────────────────────────────────
#[cfg(test)]
@@ -646,4 +686,19 @@ mod tests {
"manifest should not contain deviceId"
);
}
#[test]
fn validate_artifact_size_limit_rejects_oversized_artifacts() {
let err = validate_artifact_size_limit("skills.zip", MAX_SYNC_ARTIFACT_BYTES + 1)
.expect_err("artifact larger than limit should be rejected");
assert!(
err.to_string().contains("too large") || err.to_string().contains("超过"),
"unexpected error: {err}"
);
}
#[test]
fn validate_artifact_size_limit_accepts_limit_boundary() {
assert!(validate_artifact_size_limit("skills.zip", MAX_SYNC_ARTIFACT_BYTES).is_ok());
}
}

View File

@@ -10,10 +10,8 @@ use zip::DateTime;
use crate::error::AppError;
use crate::services::skill::SkillService;
use super::{io_context_localized, localized, REMOTE_SKILLS_ZIP};
use super::{io_context_localized, localized, MAX_SYNC_ARTIFACT_BYTES, REMOTE_SKILLS_ZIP};
/// Maximum total bytes allowed during zip extraction (512 MB).
const MAX_EXTRACT_BYTES: u64 = 512 * 1024 * 1024;
/// Maximum number of entries allowed in a zip archive.
const MAX_EXTRACT_ENTRIES: usize = 10_000;
@@ -92,8 +90,14 @@ pub(super) fn restore_skills_zip(raw: &[u8]) -> Result<(), AppError> {
if archive.len() > MAX_EXTRACT_ENTRIES {
return Err(localized(
"webdav.sync.skills_zip_too_many_entries",
format!("skills.zip 条目数过多({}),上限 {MAX_EXTRACT_ENTRIES}", archive.len()),
format!("skills.zip has too many entries ({}), limit is {MAX_EXTRACT_ENTRIES}", archive.len()),
format!(
"skills.zip 条目数过多({}),上限 {MAX_EXTRACT_ENTRIES}",
archive.len()
),
format!(
"skills.zip has too many entries ({}), limit is {MAX_EXTRACT_ENTRIES}",
archive.len()
),
));
}
@@ -118,15 +122,13 @@ pub(super) fn restore_skills_zip(raw: &[u8]) -> Result<(), AppError> {
fs::create_dir_all(parent).map_err(|e| AppError::io(parent, e))?;
}
let mut out = fs::File::create(&out_path).map_err(|e| AppError::io(&out_path, e))?;
let written = std::io::copy(&mut entry, &mut out).map_err(|e| AppError::io(&out_path, e))?;
total_bytes += written;
if total_bytes > MAX_EXTRACT_BYTES {
return Err(localized(
"webdav.sync.skills_zip_too_large",
format!("skills.zip 解压后体积超过上限({} MB)", MAX_EXTRACT_BYTES / 1024 / 1024),
format!("skills.zip extracted size exceeds limit ({} MB)", MAX_EXTRACT_BYTES / 1024 / 1024),
));
}
let _written = copy_entry_with_total_limit(
&mut entry,
&mut out,
&mut total_bytes,
MAX_SYNC_ARTIFACT_BYTES,
&out_path,
)?;
}
let ssot = SkillService::get_ssot_dir().map_err(|e| {
@@ -327,10 +329,47 @@ fn mark_visited_dir(path: &Path, visited: &mut HashSet<PathBuf>) -> Result<bool,
Ok(visited.insert(canonical))
}
fn copy_entry_with_total_limit<R: Read, W: Write>(
reader: &mut R,
writer: &mut W,
total_bytes: &mut u64,
max_total_bytes: u64,
out_path: &Path,
) -> Result<u64, AppError> {
let mut buffer = [0u8; 16 * 1024];
let mut written = 0u64;
loop {
let n = reader
.read(&mut buffer)
.map_err(|e| AppError::io(out_path, e))?;
if n == 0 {
break;
}
if total_bytes.saturating_add(n as u64) > max_total_bytes {
let max_mb = max_total_bytes / 1024 / 1024;
return Err(localized(
"webdav.sync.skills_zip_too_large",
format!("skills.zip 解压后体积超过上限({} MB)", max_mb),
format!("skills.zip extracted size exceeds limit ({} MB)", max_mb),
));
}
writer
.write_all(&buffer[..n])
.map_err(|e| AppError::io(out_path, e))?;
*total_bytes += n as u64;
written += n as u64;
}
Ok(written)
}
#[cfg(test)]
mod tests {
use super::mark_visited_dir;
use super::{copy_entry_with_total_limit, mark_visited_dir};
use std::collections::HashSet;
use std::io::Cursor;
use std::path::Path;
use tempfile::tempdir;
#[test]
@@ -343,4 +382,29 @@ mod tests {
assert!(mark_visited_dir(&dir, &mut visited).expect("first visit"));
assert!(!mark_visited_dir(&dir, &mut visited).expect("second visit"));
}
#[test]
fn copy_entry_with_total_limit_rejects_oversized_stream_before_write() {
let mut reader = Cursor::new(vec![1u8; 16]);
let mut writer = Vec::new();
let mut total_bytes = 0u64;
let err = copy_entry_with_total_limit(
&mut reader,
&mut writer,
&mut total_bytes,
8,
Path::new("skills-extracted/file.bin"),
)
.expect_err("stream larger than limit should be rejected");
assert!(
err.to_string().contains("too large") || err.to_string().contains("超过"),
"unexpected error: {err}"
);
assert_eq!(
writer.len(),
0,
"should not write when the first chunk exceeds limit"
);
}
}

View File

@@ -72,6 +72,8 @@ pub struct WebDavSyncStatus {
#[serde(default, skip_serializing_if = "Option::is_none")]
pub last_error: Option<String>,
#[serde(default, skip_serializing_if = "Option::is_none")]
pub last_error_source: Option<String>,
#[serde(default, skip_serializing_if = "Option::is_none")]
pub last_remote_etag: Option<String>,
#[serde(default, skip_serializing_if = "Option::is_none")]
pub last_local_manifest_hash: Option<String>,
@@ -93,6 +95,8 @@ pub struct WebDavSyncSettings {
#[serde(default)]
pub enabled: bool,
#[serde(default)]
pub auto_sync: bool,
#[serde(default)]
pub base_url: String,
#[serde(default)]
pub username: String,
@@ -110,6 +114,7 @@ impl Default for WebDavSyncSettings {
fn default() -> Self {
Self {
enabled: false,
auto_sync: false,
base_url: String::new(),
username: String::new(),
password: String::new(),

View File

@@ -3,6 +3,7 @@ import { useTranslation } from "react-i18next";
import { motion, AnimatePresence } from "framer-motion";
import { toast } from "sonner";
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { useQueryClient } from "@tanstack/react-query";
import {
Plus,
@@ -81,6 +82,12 @@ type View =
| "openclawTools"
| "openclawAgents";
interface WebDavSyncStatusUpdatedPayload {
source?: string;
status?: string;
error?: string;
}
const DRAG_BAR_HEIGHT = isWindows() || isLinux() ? 0 : 28; // px
const HEADER_HEIGHT = 64; // px
const CONTENT_TOP_OFFSET = DRAG_BAR_HEIGHT + HEADER_HEIGHT;
@@ -295,6 +302,49 @@ function App() {
};
}, [queryClient]);
useEffect(() => {
let unsubscribe: (() => void) | undefined;
let active = true;
const setupListener = async () => {
try {
const off = await listen(
"webdav-sync-status-updated",
async (event) => {
const payload = (event.payload ?? {}) as WebDavSyncStatusUpdatedPayload;
await queryClient.invalidateQueries({ queryKey: ["settings"] });
if (payload.source !== "auto" || payload.status !== "error") {
return;
}
toast.error(
t("settings.webdavSync.autoSyncFailedToast", {
error: payload.error || t("common.unknown"),
}),
);
},
);
if (!active) {
off();
return;
}
unsubscribe = off;
} catch (error) {
console.error(
"[App] Failed to subscribe webdav-sync-status-updated event",
error,
);
}
};
void setupListener();
return () => {
active = false;
unsubscribe?.();
};
}, [queryClient, t]);
useEffect(() => {
const checkEnvOnStartup = async () => {
try {

View File

@@ -1,5 +1,5 @@
import { useTranslation } from "react-i18next";
import { useEffect, useState } from "react";
import { useEffect, useState, useCallback, useMemo } from "react";
import { FullScreenPanel } from "@/components/common/FullScreenPanel";
import { Label } from "@/components/ui/label";
import { Button } from "@/components/ui/button";
@@ -53,6 +53,81 @@ export function CommonConfigEditor({
return () => observer.disconnect();
}, []);
// Mirror value prop to local state so checkbox toggles and JsonEditor stay in sync
// (parent uses form.getValues which doesn't trigger re-renders)
const [localValue, setLocalValue] = useState(value);
useEffect(() => {
setLocalValue(value);
}, [value]);
const handleLocalChange = useCallback(
(newValue: string) => {
setLocalValue(newValue);
onChange(newValue);
},
[onChange],
);
const toggleStates = useMemo(() => {
try {
const config = JSON.parse(localValue);
return {
hideAttribution:
config?.attribution?.commit === "" && config?.attribution?.pr === "",
alwaysThinking: config?.alwaysThinkingEnabled === true,
teammates:
config?.env?.CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS === "1" ||
config?.env?.CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS === 1,
};
} catch {
return {
hideAttribution: false,
alwaysThinking: false,
teammates: false,
};
}
}, [localValue]);
const handleToggle = useCallback(
(toggleKey: string, checked: boolean) => {
try {
const config = JSON.parse(localValue || "{}");
switch (toggleKey) {
case "hideAttribution":
if (checked) {
config.attribution = { commit: "", pr: "" };
} else {
delete config.attribution;
}
break;
case "alwaysThinking":
if (checked) {
config.alwaysThinkingEnabled = true;
} else {
delete config.alwaysThinkingEnabled;
}
break;
case "teammates":
if (!config.env) config.env = {};
if (checked) {
config.env.CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS = "1";
} else {
delete config.env.CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS;
if (Object.keys(config.env).length === 0) delete config.env;
}
break;
}
handleLocalChange(JSON.stringify(config, null, 2));
} catch {
// Don't modify if JSON is invalid
}
},
[localValue, handleLocalChange],
);
return (
<>
<div className="space-y-2">
@@ -91,9 +166,40 @@ export function CommonConfigEditor({
{commonConfigError}
</p>
)}
<div className="flex flex-wrap items-center gap-x-4 gap-y-1">
<label className="inline-flex items-center gap-2 text-sm text-muted-foreground cursor-pointer">
<input
type="checkbox"
checked={toggleStates.hideAttribution}
onChange={(e) =>
handleToggle("hideAttribution", e.target.checked)
}
className="w-4 h-4 text-blue-500 bg-white dark:bg-gray-800 border-border-default rounded focus:ring-blue-500 dark:focus:ring-blue-400 focus:ring-2"
/>
<span>{t("claudeConfig.hideAttribution")}</span>
</label>
<label className="inline-flex items-center gap-2 text-sm text-muted-foreground cursor-pointer">
<input
type="checkbox"
checked={toggleStates.alwaysThinking}
onChange={(e) => handleToggle("alwaysThinking", e.target.checked)}
className="w-4 h-4 text-blue-500 bg-white dark:bg-gray-800 border-border-default rounded focus:ring-blue-500 dark:focus:ring-blue-400 focus:ring-2"
/>
<span>{t("claudeConfig.alwaysThinking")}</span>
</label>
<label className="inline-flex items-center gap-2 text-sm text-muted-foreground cursor-pointer">
<input
type="checkbox"
checked={toggleStates.teammates}
onChange={(e) => handleToggle("teammates", e.target.checked)}
className="w-4 h-4 text-blue-500 bg-white dark:bg-gray-800 border-border-default rounded focus:ring-blue-500 dark:focus:ring-blue-400 focus:ring-2"
/>
<span>{t("claudeConfig.enableTeammates")}</span>
</label>
</div>
<JsonEditor
value={value}
onChange={onChange}
value={localValue}
onChange={handleLocalChange}
placeholder={`{
"env": {
"ANTHROPIC_BASE_URL": "https://your-api-endpoint.com",

View File
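The `handleToggle` cases above share one pattern for env-backed flags: set the key when the toggle is on, otherwise delete it and prune the `env` object once it is empty, so disabled flags leave no residue in the saved JSON. A minimal standalone sketch (the helper name and `ClaudeConfig` shape are hypothetical, not part of the diff):

```typescript
// Hypothetical helper mirroring the toggle logic above: set an env key when
// enabled, otherwise delete it and prune `env` once it holds no keys.
interface ClaudeConfig {
  env?: Record<string, string>;
  [key: string]: unknown;
}

function toggleEnvFlag(
  config: ClaudeConfig,
  key: string,
  checked: boolean,
): ClaudeConfig {
  // Copy so the caller's object is not mutated.
  const next: ClaudeConfig = { ...config, env: { ...(config.env ?? {}) } };
  if (checked) {
    next.env![key] = "1";
  } else {
    delete next.env![key];
    if (Object.keys(next.env!).length === 0) delete next.env;
  }
  return next;
}
```

Round-tripping a flag through on/off should therefore reproduce the original config exactly, which keeps `JSON.stringify` diffs clean.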

@@ -38,7 +38,7 @@ export function DirectorySettings({
const { t } = useTranslation();
return (
<>
<div className="space-y-6">
{/* CC Switch config directory (standalone section) */}
<section className="space-y-4">
<header className="space-y-1">
@@ -131,7 +131,7 @@ export function DirectorySettings({
onReset={() => onResetDirectory("opencode")}
/>
</section>
</>
</div>
);
}

View File

@@ -53,7 +53,7 @@ export function ImportExportSection({
</p>
</header>
<div className="space-y-4 rounded-xl glass-card p-6 border border-white/10">
<div className="space-y-4 rounded-lg border border-border bg-muted/40 p-6">
{/* Import and Export Buttons Side by Side */}
<div className="grid grid-cols-2 gap-4 items-stretch">
{/* Import Button */}

View File

@@ -16,6 +16,7 @@ import { useQueryClient } from "@tanstack/react-query";
import { toast } from "sonner";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Switch } from "@/components/ui/switch";
import {
Select,
SelectContent,
@@ -162,6 +163,7 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
password: config?.password ?? "",
remoteRoot: config?.remoteRoot ?? "cc-switch-sync",
profile: config?.profile ?? "default",
autoSync: config?.autoSync ?? false,
}));
// Preset selector — derived from initial URL, updated on user selection
@@ -196,6 +198,7 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
password: config.password ?? "",
remoteRoot: config.remoteRoot ?? "cc-switch-sync",
profile: config.profile ?? "default",
autoSync: config.autoSync ?? false,
});
setPasswordTouched(false);
setPresetId(detectPreset(config.baseUrl ?? ""));
@@ -237,6 +240,16 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
}
}, [form.baseUrl, presetId]);
const handleAutoSyncChange = useCallback((checked: boolean) => {
setForm((prev) => ({ ...prev, autoSync: checked }));
setDirty(true);
setJustSaved(false);
if (justSavedTimerRef.current) {
clearTimeout(justSavedTimerRef.current);
justSavedTimerRef.current = null;
}
}, []);
const buildSettings = useCallback((): WebDavSyncSettings | null => {
const baseUrl = form.baseUrl.trim();
if (!baseUrl) return null;
@@ -247,6 +260,7 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
password: form.password,
remoteRoot: form.remoteRoot.trim() || "cc-switch-sync",
profile: form.profile.trim() || "default",
autoSync: form.autoSync,
};
}, [form]);
@@ -433,6 +447,9 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
const lastSyncDisplay = lastSyncAt
? new Date(lastSyncAt * 1000).toLocaleString()
: null;
const lastError = config?.status?.lastError?.trim();
const showAutoSyncError =
!!lastError && config?.status?.lastErrorSource === "auto";
// ─── Render ─────────────────────────────────────────────
@@ -559,6 +576,23 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
disabled={isLoading}
/>
</div>
<div className="flex items-start gap-4">
<label className="w-40 text-xs font-medium text-foreground shrink-0">
{t("settings.webdavSync.autoSync")}
<span className="block text-[10px] font-normal text-muted-foreground">
{t("settings.webdavSync.autoSyncHint")}
</span>
</label>
<div className="pt-1">
<Switch
checked={form.autoSync}
onCheckedChange={handleAutoSyncChange}
aria-label={t("settings.webdavSync.autoSync")}
disabled={isLoading}
/>
</div>
</div>
</div>
{/* Last sync time */}
@@ -567,6 +601,17 @@ export function WebdavSyncSection({ config }: WebdavSyncSectionProps) {
{t("settings.webdavSync.lastSync", { time: lastSyncDisplay })}
</p>
)}
{showAutoSyncError && (
<div className="rounded-lg border border-red-300/70 bg-red-50/80 px-3 py-2 text-xs text-red-900 dark:border-red-500/50 dark:bg-red-950/30 dark:text-red-200">
<p className="font-medium">
{t("settings.webdavSync.autoSyncLastErrorTitle")}
</p>
<p className="mt-1 break-all whitespace-pre-wrap">{lastError}</p>
<p className="mt-1 text-[11px] text-red-700/90 dark:text-red-300/80">
{t("settings.webdavSync.autoSyncLastErrorHint")}
</p>
</div>
)}
{/* Config buttons + save status */}
<div className="flex flex-wrap items-center gap-3 pt-2">

View File
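The error callout in the hunk above is deliberately gated on `lastErrorSource === "auto"`: manual upload/download failures are already surfaced via toast, so only background auto-sync failures get the persistent banner, and a blank-only `lastError` never triggers it. A sketch of that predicate in isolation (function name is illustrative):

```typescript
// Sketch of the gating used above: render the callout only when a non-blank
// error exists AND it came from the background auto-sync path.
interface SyncStatus {
  lastError?: string | null;
  lastErrorSource?: string | null;
}

function shouldShowAutoSyncError(status?: SyncStatus): boolean {
  const lastError = status?.lastError?.trim();
  return !!lastError && status?.lastErrorSource === "auto";
}
```

This also explains the third test case further down: a legacy status with `lastError` but no `lastErrorSource` shows nothing, rather than misattributing an old manual error to auto sync.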

@@ -213,14 +213,12 @@ export const SkillsPage = forwardRef<SkillsPageHandle, SkillsPageProps>(
const query = searchQuery.toLowerCase();
return byStatus.filter((skill) => {
const name = skill.name?.toLowerCase() || "";
const description = skill.description?.toLowerCase() || "";
const directory = skill.directory?.toLowerCase() || "";
const repo =
skill.repoOwner && skill.repoName
? `${skill.repoOwner}/${skill.repoName}`.toLowerCase()
: "";
return (
name.includes(query) ||
description.includes(query) ||
directory.includes(query)
);
return name.includes(query) || repo.includes(query);
});
}, [skills, searchQuery, filterRepo, filterStatus]);

View File
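The filter change above narrows matching from name/description/directory to name and a case-insensitive `"owner/repo"` string (matching the updated search placeholders in the locale files below). Extracted as a standalone predicate for illustration (types and name are hypothetical):

```typescript
// Sketch of the narrowed skill search: match on name or "owner/repo" only.
interface SkillLike {
  name?: string;
  repoOwner?: string;
  repoName?: string;
}

function matchesQuery(skill: SkillLike, rawQuery: string): boolean {
  const query = rawQuery.toLowerCase();
  const name = skill.name?.toLowerCase() ?? "";
  // Skills without both owner and repo simply never match on the repo field.
  const repo =
    skill.repoOwner && skill.repoName
      ? `${skill.repoOwner}/${skill.repoName}`.toLowerCase()
      : "";
  return name.includes(query) || repo.includes(query);
}
```

Note the repo string is only built when both halves are present, so a partially populated skill falls back to name-only matching instead of matching a bare owner or repo fragment.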

@@ -306,6 +306,7 @@ interface ImportSkillsDialogProps {
name: string;
description?: string;
foundIn: string[];
path: string;
}>;
onImport: (directories: string[]) => void;
onClose: () => void;
@@ -362,8 +363,11 @@ const ImportSkillsDialog: React.FC<ImportSkillsDialogProps> = ({
{skill.description}
</div>
)}
<div className="text-xs text-muted-foreground/70 mt-1">
{t("skills.foundIn")}: {skill.foundIn.join(", ")}
<div
className="text-xs text-muted-foreground/50 mt-1 truncate"
title={skill.path}
>
{skill.path}
</div>
</div>
</label>

View File

@@ -58,7 +58,10 @@
"extractFromCurrent": "Extract from Editor",
"extractNoCommonConfig": "No common config available to extract from editor",
"extractFailed": "Extract failed: {{error}}",
"saveFailed": "Save failed: {{error}}"
"saveFailed": "Save failed: {{error}}",
"hideAttribution": "Hide AI Attribution",
"alwaysThinking": "Extended Thinking",
"enableTeammates": "Teammates Mode"
},
"header": {
"viewOnGithub": "View on GitHub",
@@ -291,6 +294,8 @@
"passwordPlaceholder": "App password",
"remoteRoot": "Remote Root Directory",
"profile": "Sync Profile Name",
"autoSync": "Auto Sync",
"autoSyncHint": "When enabled, each database change triggers an automatic WebDAV upload.",
"test": "Test Connection",
"testing": "Testing...",
"testSuccess": "Connection successful",
@@ -302,11 +307,14 @@
"uploading": "Uploading...",
"uploadSuccess": "Uploaded to WebDAV",
"uploadFailed": "Upload failed: {{error}}",
"autoSyncFailedToast": "Auto sync failed: {{error}}",
"download": "Download from Cloud",
"downloading": "Downloading...",
"downloadSuccess": "Downloaded and restored from WebDAV",
"downloadFailed": "Download failed: {{error}}",
"lastSync": "Last sync: {{time}}",
"autoSyncLastErrorTitle": "Last auto sync failed",
"autoSyncLastErrorHint": "Please check network or WebDAV settings. Auto sync will retry on future changes.",
"missingUrl": "Please enter the WebDAV server URL",
"presets": {
"label": "Provider",
@@ -1336,7 +1344,7 @@
"skillCount": "{{count}} skills detected"
},
"search": "Search Skills",
"searchPlaceholder": "Search skill name or description...",
"searchPlaceholder": "Search skill name or repo...",
"filter": {
"placeholder": "Filter by status",
"all": "All",

View File

@@ -58,7 +58,10 @@
"extractFromCurrent": "編集内容から抽出",
"extractNoCommonConfig": "編集内容から抽出できる共通設定がありません",
"extractFailed": "抽出に失敗しました: {{error}}",
"saveFailed": "保存に失敗しました: {{error}}"
"saveFailed": "保存に失敗しました: {{error}}",
"hideAttribution": "AI署名を非表示",
"alwaysThinking": "拡張思考",
"enableTeammates": "Teammates モード"
},
"header": {
"viewOnGithub": "GitHub で見る",
@@ -291,6 +294,8 @@
"passwordPlaceholder": "アプリパスワード",
"remoteRoot": "リモートルートディレクトリ",
"profile": "同期プロファイル名",
"autoSync": "自動同期",
"autoSyncHint": "有効にすると、データベース変更のたびに WebDAV へ自動アップロードします。",
"test": "接続テスト",
"testing": "テスト中...",
"testSuccess": "接続成功",
@@ -302,11 +307,14 @@
"uploading": "アップロード中...",
"uploadSuccess": "WebDAV にアップロードしました",
"uploadFailed": "アップロードに失敗しました:{{error}}",
"autoSyncFailedToast": "自動同期に失敗しました:{{error}}",
"download": "クラウドからダウンロード",
"downloading": "ダウンロード中...",
"downloadSuccess": "WebDAV からダウンロード・復元しました",
"downloadFailed": "ダウンロードに失敗しました:{{error}}",
"lastSync": "前回の同期:{{time}}",
"autoSyncLastErrorTitle": "前回の自動同期に失敗しました",
"autoSyncLastErrorHint": "ネットワークまたは WebDAV 設定を確認してください。次回の変更時に自動再試行されます。",
"missingUrl": "WebDAV サーバー URL を入力してください",
"presets": {
"label": "サービス",
@@ -1334,7 +1342,7 @@
"skillCount": "{{count}} 件のスキルを検出"
},
"search": "スキルを検索",
"searchPlaceholder": "スキル名または説明で検索...",
"searchPlaceholder": "スキル名またはリポジトリで検索...",
"filter": {
"placeholder": "状態で絞り込み",
"all": "すべて",

View File

@@ -58,7 +58,10 @@
"extractFromCurrent": "从编辑内容提取",
"extractNoCommonConfig": "当前编辑内容没有可提取的通用配置",
"extractFailed": "提取失败: {{error}}",
"saveFailed": "保存失败: {{error}}"
"saveFailed": "保存失败: {{error}}",
"hideAttribution": "隐藏 AI 署名",
"alwaysThinking": "扩展思考",
"enableTeammates": "Teammates 模式"
},
"header": {
"viewOnGithub": "在 GitHub 上查看",
@@ -291,6 +294,8 @@
"passwordPlaceholder": "应用密码(坚果云请使用「第三方应用密码」)",
"remoteRoot": "远程根目录",
"profile": "同步配置名",
"autoSync": "自动同步",
"autoSyncHint": "开启后每次数据库变更都会自动上传到 WebDAV。",
"test": "测试连接",
"testing": "测试中...",
"testSuccess": "连接成功",
@@ -302,11 +307,14 @@
"uploading": "上传中...",
"uploadSuccess": "已上传到 WebDAV",
"uploadFailed": "上传失败:{{error}}",
"autoSyncFailedToast": "自动同步失败:{{error}}",
"download": "从云端下载",
"downloading": "下载中...",
"downloadSuccess": "已从 WebDAV 下载并恢复",
"downloadFailed": "下载失败:{{error}}",
"lastSync": "上次同步:{{time}}",
"autoSyncLastErrorTitle": "上次自动同步失败",
"autoSyncLastErrorHint": "请检查网络或 WebDAV 配置,系统会在后续变更时继续自动重试。",
"missingUrl": "请填写 WebDAV 服务器地址",
"presets": {
"label": "服务商",
@@ -1336,7 +1344,7 @@
"skillCount": "识别到 {{count}} 个技能"
},
"search": "搜索技能",
"searchPlaceholder": "搜索技能名称或描述...",
"searchPlaceholder": "搜索技能名称或仓库名称...",
"filter": {
"placeholder": "状态筛选",
"all": "全部",

View File

@@ -45,6 +45,7 @@ export interface UnmanagedSkill {
name: string;
description?: string;
foundIn: string[];
path: string;
}
/** Skill object (backward compatible with the legacy API) */

View File

@@ -33,6 +33,7 @@ export const settingsSchema = z.object({
webdavSync: z
.object({
enabled: z.boolean().optional(),
autoSync: z.boolean().optional(),
baseUrl: z.string().trim().optional().or(z.literal("")),
username: z.string().trim().optional().or(z.literal("")),
password: z.string().optional(),
@@ -42,6 +43,7 @@ export const settingsSchema = z.object({
.object({
lastSyncAt: z.number().nullable().optional(),
lastError: z.string().nullable().optional(),
lastErrorSource: z.string().nullable().optional(),
lastRemoteEtag: z.string().nullable().optional(),
lastLocalManifestHash: z.string().nullable().optional(),
lastRemoteManifestHash: z.string().nullable().optional(),

View File

@@ -170,6 +170,7 @@ export interface VisibleApps {
export interface WebDavSyncStatus {
lastSyncAt?: number | null;
lastError?: string | null;
lastErrorSource?: string | null;
lastRemoteEtag?: string | null;
lastLocalManifestHash?: string | null;
lastRemoteManifestHash?: string | null;
@@ -178,6 +179,7 @@ export interface WebDavSyncStatus {
// WebDAV v2 sync settings
export interface WebDavSyncSettings {
enabled?: boolean;
autoSync?: boolean;
baseUrl?: string;
username?: string;
password?: string;

View File

@@ -34,6 +34,17 @@ vi.mock("@/components/ui/input", () => ({
Input: (props: any) => <input {...props} />,
}));
vi.mock("@/components/ui/switch", () => ({
Switch: ({ checked, onCheckedChange, ...props }: any) => (
<button
role="switch"
aria-checked={checked}
onClick={() => onCheckedChange?.(!checked)}
{...props}
/>
),
}));
vi.mock("@/components/ui/select", () => ({
Select: ({ value, onValueChange, children }: any) => (
<select
@@ -82,6 +93,7 @@ const baseConfig: WebDavSyncSettings = {
password: "secret",
remoteRoot: "cc-switch-sync",
profile: "default",
autoSync: false,
status: {},
};
@@ -128,6 +140,49 @@ describe("WebdavSyncSection", () => {
settingsApiMock.webdavSyncDownload.mockResolvedValue({ status: "downloaded" });
});
it("shows auto sync error callout when last auto sync failed", () => {
renderSection({
...baseConfig,
status: {
lastError: "network timeout",
lastErrorSource: "auto",
},
});
expect(
screen.getByText("settings.webdavSync.autoSyncLastErrorTitle"),
).toBeInTheDocument();
expect(screen.getByText("network timeout")).toBeInTheDocument();
});
it("does not show auto sync error callout for manual sync errors", () => {
renderSection({
...baseConfig,
status: {
lastError: "manual upload failed",
lastErrorSource: "manual",
},
});
expect(
screen.queryByText("settings.webdavSync.autoSyncLastErrorTitle"),
).not.toBeInTheDocument();
});
it("does not show auto sync error callout when source is missing", () => {
renderSection({
...baseConfig,
autoSync: true,
status: {
lastError: "legacy error without source",
},
});
expect(
screen.queryByText("settings.webdavSync.autoSyncLastErrorTitle"),
).not.toBeInTheDocument();
});
it("shows validation error when saving without base url", async () => {
renderSection({ ...baseConfig, baseUrl: "" });
@@ -150,6 +205,7 @@ describe("WebdavSyncSection", () => {
baseUrl: "https://dav.example.com/dav/",
username: "alice",
password: "secret",
autoSync: false,
}),
false,
);
@@ -166,6 +222,24 @@ describe("WebdavSyncSection", () => {
);
});
it("saves auto sync as true after toggle", async () => {
renderSection(baseConfig);
fireEvent.click(
screen.getByRole("switch", { name: "settings.webdavSync.autoSync" }),
);
fireEvent.click(screen.getByRole("button", { name: "settings.webdavSync.save" }));
await waitFor(() => {
expect(settingsApiMock.webdavSyncSaveSettings).toHaveBeenCalledWith(
expect.objectContaining({
autoSync: true,
}),
false,
);
});
});
it("blocks upload when there are unsaved changes", async () => {
renderSection(baseConfig);

View File

@@ -209,4 +209,25 @@ describe("App integration with MSW", () => {
expect(toastErrorMock).not.toHaveBeenCalled();
expect(toastSuccessMock).toHaveBeenCalled();
});
it("shows toast when auto sync fails in background", async () => {
const { default: App } = await import("@/App");
renderApp(App);
await waitFor(() =>
expect(screen.getByTestId("provider-list").textContent).toContain(
"claude-1",
),
);
emitTauriEvent("webdav-sync-status-updated", {
source: "auto",
status: "error",
error: "network timeout",
});
await waitFor(() => {
expect(toastErrorMock).toHaveBeenCalled();
});
});
});