Compare commits

...

4 Commits

Author SHA1 Message Date
zerone0x
2937eb6766 fix(proxy): remove permissive CORS layer (#1915) 2026-04-13 12:26:19 +08:00
Dex Miller
313a6e3f6c [codex] Preserve cache_control when merging system prompts (#1946)
* Preserve cache hints when collapsing system prompts

Strict OpenAI-compatible chat backends still need fragmented Claude system prompts collapsed into one leading system message, but that normalization should not silently drop stable cache hints. Preserve message-level cache_control when the merged system fragments agree, and fall back to omitting it when the fragments conflict.

Constraint: Must keep single-system normalization for Nvidia/Qwen-style chat backends
Rejected: Always copy the first cache_control | could misrepresent conflicting cache boundaries
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: If system prompt merging changes again, preserve cache_control whenever the merged metadata is unambiguous
Tested: cargo test proxy::providers::transform --manifest-path src-tauri/Cargo.toml
Not-tested: End-to-end prompt caching behavior against cache-aware OpenAI-compatible upstreams
Related: #1881

* Tighten cache hint inheritance for merged system prompts

The follow-up cache hint fix still treated mixed present/absent cache_control across fragmented system prompts as inheritable, which expanded the cache scope after prompt collapse. Treat that mix as ambiguous and only preserve cache_control when every merged fragment explicitly agrees on the same value.

Constraint: Must preserve strict-backend system prompt normalization from #1942
Rejected: Inherit first present cache_control | widens cache scope when later fragments were intentionally uncached
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Any future merged-system cache hint logic should treat missing cache_control as semantically significant
Tested: cargo test proxy::providers::transform --manifest-path src-tauri/Cargo.toml
Not-tested: End-to-end upstream caching behavior against cache-aware relays
Related: #1881
Related: #1946

* Keep cache-control merge regressions easy to review

Reflow the two long cache-control regression assertions in transform.rs so the neighboring merge cases stay rustfmt-aligned and easier to scan.

This keeps the preserved code change separate from the untracked Markdown design notes the user did not want committed.

Constraint: Exclude Markdown design files from the commit while preserving the local code change
Rejected: Include docs in the same commit | user explicitly asked to leave Markdown files out
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Treat this as a readability-only test change; do not infer runtime behavior changes from it
Tested: cargo test --manifest-path src-tauri/Cargo.toml test_anthropic_to_openai_drops_ --lib
Tested: cargo check --manifest-path src-tauri/Cargo.toml --tests
Tested: pnpm format:check
Tested: pnpm typecheck
Not-tested: Full application integration and manual flows
2026-04-13 10:42:29 +08:00
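The agreement rule this commit describes can be sketched as a minimal std-only illustration. The function name and the string-valued hints below are hypothetical stand-ins for the repo's real serde_json values, not its actual API:

```rust
/// Illustrative sketch of the merge rule above: each system fragment's
/// `cache_control` is modeled as an `Option<&str>`. The merged message
/// keeps the hint only when every fragment explicitly carries the same
/// value; a missing hint or a conflicting one makes the result ambiguous
/// and the hint is dropped.
fn merged_cache_control<'a>(fragments: &[Option<&'a str>]) -> Option<&'a str> {
    let mut agreed: Option<&'a str> = None;
    for hint in fragments.iter().copied() {
        match hint {
            // Absence is semantically significant: treat the mix as ambiguous.
            None => return None,
            Some(value) => match agreed {
                None => agreed = Some(value),
                Some(existing) if existing == value => {}
                // Conflicting values: drop the hint entirely.
                Some(_) => return None,
            },
        }
    }
    agreed
}

fn main() {
    // All fragments agree: the hint survives the merge.
    assert_eq!(
        merged_cache_control(&[Some("ephemeral"), Some("ephemeral")]),
        Some("ephemeral")
    );
    // Mixed present/absent: ambiguous, so no hint is inherited.
    assert_eq!(merged_cache_control(&[Some("ephemeral"), None]), None);
    // Conflicting hints: also dropped.
    assert_eq!(merged_cache_control(&[Some("ephemeral"), Some("ephemeral-5m")]), None);
}
```

This mirrors the three regression tests in the transform.rs diff: matching hints are preserved, while mixed and conflicting hints both fall back to no `cache_control` on the merged message.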
Dex Miller
5566be2b4b Stop sending prompt cache keys on Claude chat conversions (#2003)
Responses conversions still use promptCacheKey, but chat completions now stay a pure shape transform. This keeps Claude -> chat requests aligned with providers that do not understand the field and keeps stream checks consistent with production behavior.

Constraint: Issue #1919 requires removing prompt_cache_key from Claude -> OpenAI Chat requests
Rejected: Add a runtime toggle for chat injection | requested behavior is unconditional removal
Confidence: high
Scope-risk: narrow
Reversibility: clean
Directive: Keep promptCacheKey limited to Claude -> Responses conversions unless a provider-specific contract is proven
Tested: cargo test anthropic_to_openai
Tested: cargo test anthropic_to_responses_with_cache_key
Tested: cargo test transform_claude_request_for_api_format_responses
Not-tested: Full src-tauri test suite
Related: #1919
2026-04-13 10:22:55 +08:00
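The dispatch rule this commit lands can be sketched as follows. The function name is illustrative (the real logic lives inside `transform_claude_request_for_api_format`, shown in the diff below), but the behavior matches: only the Responses conversion resolves a prompt cache key, preferring an explicit override and falling back to the provider id, while the Chat Completions conversion stays a pure shape transform:

```rust
/// Illustrative sketch: resolve the prompt cache key, if any, for a given
/// target API format. Chat conversions never inject one.
fn cache_key_for_format<'a>(
    api_format: &str,
    provider_id: &'a str,
    override_key: Option<&'a str>,
) -> Option<&'a str> {
    match api_format {
        // Responses conversions: explicit override, else the provider id.
        "openai_responses" => Some(override_key.unwrap_or(provider_id)),
        // "openai_chat" and passthrough formats: no cache key injection.
        _ => None,
    }
}

fn main() {
    assert_eq!(
        cache_key_for_format("openai_responses", "provider-123", None),
        Some("provider-123")
    );
    assert_eq!(
        cache_key_for_format("openai_responses", "provider-123", Some("custom-key")),
        Some("custom-key")
    );
    // Chat conversions stay a pure shape transform.
    assert_eq!(cache_key_for_format("openai_chat", "provider-123", None), None);
}
```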
v2v
cfcf9452d0 Add app-level window buttons to work around broken system window buttons on Linux Wayland (#1119)
* feat(window): add app-level window controls with settings toggle

Add a persistent settings toggle to enable app-level minimize/maximize/close controls and hide system decorations when enabled, providing a Wayland-friendly fallback for broken native titlebar interactions.

Co-authored-by: Cursor <cursoragent@cursor.com>

* fix(window): restrict app-level window controls to Linux only and fix startup flicker

- Guard useAppWindowControls with isLinux() in App.tsx so it's always
  false on macOS/Windows even if persisted as true
- Wrap set_decorations call in lib.rs with #[cfg(target_os = "linux")]
- Only show the toggle in WindowSettings on Linux
- Skip setDecorations effect while settingsData is still loading to
  prevent the Rust-side decoration state from being overridden by the
  undefined->false fallback, which caused a brief title bar flicker

---------

Co-authored-by: wzk <wx13571681304@outlook.com>
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Jason <farion1231@gmail.com>
2026-04-12 20:59:04 +08:00
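The Linux-only gating described in this commit can be sketched as a small illustration (function names are hypothetical, not the repo's exact code): the persisted `use_app_window_controls` setting only takes effect on Linux, so native decorations stay on for macOS/Windows even if the flag was persisted as true.

```rust
/// Illustrative sketch: the setting is effective only when compiled for Linux.
fn effective_app_window_controls(persisted: bool) -> bool {
    cfg!(target_os = "linux") && persisted
}

/// Native decorations are the inverse: hidden only when the app-level
/// controls are actually in effect.
fn use_native_decorations(persisted: bool) -> bool {
    !effective_app_window_controls(persisted)
}

fn main() {
    // With the setting off, every platform keeps native decorations.
    assert!(use_native_decorations(false));
    assert!(!effective_app_window_controls(false));
    // With the setting on, the outcome depends on the target OS.
    assert_eq!(effective_app_window_controls(true), cfg!(target_os = "linux"));
}
```

The same guard appears twice in the diff: `#[cfg(target_os = "linux")]` on the Rust `set_decorations` call, and `isLinux()` in front of the persisted setting in App.tsx.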
15 changed files with 301 additions and 81 deletions

View File

@@ -11,6 +11,11 @@
"updater:default",
"core:window:allow-set-skip-taskbar",
"core:window:allow-start-dragging",
"core:window:allow-minimize",
"core:window:allow-toggle-maximize",
"core:window:allow-is-maximized",
"core:window:allow-close",
"core:window:allow-set-decorations",
"process:allow-restart",
"dialog:default"
]

View File

@@ -971,6 +971,10 @@ pub fn run() {
// Silent startup: decide whether to show the main window based on settings
let settings = crate::settings::get_settings();
if let Some(window) = app.get_webview_window("main") {
// Sync decoration state before the window is first shown, to avoid a title-bar flicker from toggling after the frontend loads
// Linux only: works around unusable system window buttons under Wayland
#[cfg(target_os = "linux")]
let _ = window.set_decorations(!settings.use_app_window_controls);
if settings.silent_startup {
// 静默启动模式:保持窗口隐藏
let _ = window.hide();

View File

@@ -282,9 +282,9 @@ pub struct ProviderMeta {
/// Whether to treat base_url as the full API endpoint (no endpoint path appended)
#[serde(rename = "isFullUrl", skip_serializing_if = "Option::is_none")]
pub is_full_url: Option<bool>,
/// Prompt cache key for OpenAI-compatible endpoints.
/// When set, injected into converted requests to improve cache hit rate.
/// If not set, provider ID is used automatically during format conversion.
/// Prompt cache key for OpenAI Responses-compatible endpoints.
/// When set, injected into converted Responses requests to improve cache hit rate.
/// If not set, provider ID is used automatically during Claude -> Responses conversion.
#[serde(rename = "promptCacheKey", skip_serializing_if = "Option::is_none")]
pub prompt_cache_key: Option<String>,
/// In accumulate mode, whether this provider has already been written to the live config.

View File

@@ -81,14 +81,13 @@ pub fn transform_claude_request_for_api_format(
provider: &Provider,
api_format: &str,
) -> Result<serde_json::Value, ProxyError> {
let cache_key = provider
.meta
.as_ref()
.and_then(|m| m.prompt_cache_key.as_deref())
.unwrap_or(&provider.id);
match api_format {
"openai_responses" => {
let cache_key = provider
.meta
.as_ref()
.and_then(|m| m.prompt_cache_key.as_deref())
.unwrap_or(&provider.id);
// Codex OAuth (ChatGPT Plus/Pro relay) requires forcing store: false
// + include: ["reasoning.encrypted_content"] in the request body; handled uniformly by the transform layer.
let is_codex_oauth = provider
@@ -102,7 +101,7 @@ pub fn transform_claude_request_for_api_format(
is_codex_oauth,
)
}
"openai_chat" => super::transform::anthropic_to_openai(body, Some(cache_key)),
"openai_chat" => super::transform::anthropic_to_openai(body),
_ => Ok(body),
}
}

View File

@@ -71,10 +71,8 @@ pub fn resolve_reasoning_effort(body: &Value) -> Option<&'static str> {
}
}
/// Anthropic request → OpenAI request
///
/// `cache_key`: optional prompt_cache_key to inject for improved cache routing
pub fn anthropic_to_openai(body: Value, cache_key: Option<&str>) -> Result<Value, ProxyError> {
/// Anthropic request → OpenAI Chat Completions request
pub fn anthropic_to_openai(body: Value) -> Result<Value, ProxyError> {
let mut result = json!({});
// NOTE: model mapping is handled upstream (proxy::model_mapper); the format conversion layer only does structural transforms.
@@ -175,11 +173,6 @@ pub fn anthropic_to_openai(body: Value, cache_key: Option<&str>) -> Result<Value
result["tool_choice"] = v.clone();
}
// Inject prompt_cache_key for improved cache routing on OpenAI-compatible endpoints
if let Some(key) = cache_key {
result["prompt_cache_key"] = json!(key);
}
Ok(result)
}
@@ -206,6 +199,10 @@ fn normalize_openai_system_messages(messages: &mut Vec<Value>) {
}
let mut parts = Vec::new();
let mut inherited_cache_control: Option<Value> = None;
let mut cache_control_conflict = false;
let mut saw_cache_control = false;
let mut saw_missing_cache_control = false;
messages.retain(|message| {
if message.get("role").and_then(|value| value.as_str()) != Some("system") {
return true;
@@ -226,11 +223,28 @@ fn normalize_openai_system_messages(messages: &mut Vec<Value>) {
_ => {}
}
if let Some(cache_control) = message.get("cache_control") {
saw_cache_control = true;
match &inherited_cache_control {
None => inherited_cache_control = Some(cache_control.clone()),
Some(existing) if existing == cache_control => {}
Some(_) => cache_control_conflict = true,
}
} else {
saw_missing_cache_control = true;
}
false
});
if !parts.is_empty() {
messages.insert(0, json!({"role": "system", "content": parts.join("\n")}));
let mut merged = json!({"role": "system", "content": parts.join("\n")});
if !(cache_control_conflict || (saw_cache_control && saw_missing_cache_control)) {
if let Some(cache_control) = inherited_cache_control {
merged["cache_control"] = cache_control;
}
}
messages.insert(0, merged);
}
}
@@ -569,7 +583,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["model"], "claude-3-opus");
assert_eq!(result["max_tokens"], 1024);
assert_eq!(result["messages"][0]["role"], "user");
@@ -585,7 +599,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["messages"][0]["role"], "system");
assert_eq!(
result["messages"][0]["content"],
@@ -607,36 +621,76 @@ mod tests {
}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["tools"][0]["type"], "function");
assert_eq!(result["tools"][0]["function"]["name"], "get_weather");
}
#[test]
fn test_anthropic_to_openai_normalizes_fragmented_system_messages() {
fn test_anthropic_to_openai_preserves_matching_system_cache_control_when_merging() {
let input = json!({
"model": "claude-3-sonnet",
"max_tokens": 1024,
"system": [
{"type": "text", "text": "You are Claude Code."},
{"type": "text", "text": "Be concise."}
{"type": "text", "text": "You are Claude Code.", "cache_control": {"type": "ephemeral"}},
{"type": "text", "text": "Be concise.", "cache_control": {"type": "ephemeral"}}
],
"messages": [
{"role": "system", "content": "Follow repo conventions."},
{"role": "user", "content": "Hello"}
]
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["messages"].as_array().unwrap().len(), 2);
assert_eq!(result["messages"][0]["role"], "system");
assert_eq!(
result["messages"][0]["content"],
"You are Claude Code.\nBe concise.\nFollow repo conventions."
"You are Claude Code.\nBe concise."
);
assert_eq!(result["messages"][0]["cache_control"]["type"], "ephemeral");
assert_eq!(result["messages"][1]["role"], "user");
}
#[test]
fn test_anthropic_to_openai_drops_mixed_present_absent_system_cache_control_when_merging() {
let input = json!({
"model": "claude-3-sonnet",
"max_tokens": 1024,
"system": [
{"type": "text", "text": "You are Claude Code.", "cache_control": {"type": "ephemeral"}},
{"type": "text", "text": "Be concise."}
],
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
assert_eq!(result["messages"][0]["role"], "system");
assert_eq!(
result["messages"][0]["content"],
"You are Claude Code.\nBe concise."
);
assert!(result["messages"][0].get("cache_control").is_none());
}
#[test]
fn test_anthropic_to_openai_drops_conflicting_system_cache_control_when_merging() {
let input = json!({
"model": "claude-3-sonnet",
"max_tokens": 1024,
"system": [
{"type": "text", "text": "You are Claude Code.", "cache_control": {"type": "ephemeral"}},
{"type": "text", "text": "Be concise.", "cache_control": {"type": "ephemeral", "ttl": "5m"}}
],
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
assert_eq!(result["messages"][0]["role"], "system");
assert_eq!(
result["messages"][0]["content"],
"You are Claude Code.\nBe concise."
);
assert!(result["messages"][0].get("cache_control").is_none());
}
#[test]
fn test_anthropic_to_openai_tool_use() {
let input = json!({
@@ -651,7 +705,7 @@ mod tests {
}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
let msg = &result["messages"][0];
assert_eq!(msg["role"], "assistant");
assert!(msg.get("tool_calls").is_some());
@@ -671,7 +725,7 @@ mod tests {
}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
let msg = &result["messages"][0];
assert_eq!(msg["role"], "tool");
assert_eq!(msg["tool_call_id"], "call_123");
@@ -743,31 +797,19 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["model"], "gpt-4o");
}
#[test]
fn test_anthropic_to_openai_with_cache_key() {
fn test_anthropic_to_openai_does_not_inject_prompt_cache_key() {
let input = json!({
"model": "claude-3-opus",
"max_tokens": 1024,
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, Some("provider-123")).unwrap();
assert_eq!(result["prompt_cache_key"], "provider-123");
}
#[test]
fn test_anthropic_to_openai_no_cache_key() {
let input = json!({
"model": "claude-3-opus",
"max_tokens": 1024,
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert!(result.get("prompt_cache_key").is_none());
}
@@ -793,7 +835,7 @@ mod tests {
}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
// System message cache_control preserved
assert_eq!(result["messages"][0]["cache_control"]["type"], "ephemeral");
// Text block cache_control preserved
@@ -1047,7 +1089,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert!(result.get("reasoning_effort").is_none());
}
@@ -1060,7 +1102,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["reasoning_effort"], "medium");
}
@@ -1073,7 +1115,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["reasoning_effort"], "xhigh");
}
@@ -1086,7 +1128,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["reasoning_effort"], "low");
}
@@ -1099,7 +1141,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["reasoning_effort"], "xhigh");
}
@@ -1111,7 +1153,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert!(result.get("reasoning_effort").is_none());
}
@@ -1124,7 +1166,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert!(
result.get("max_tokens").is_none(),
"{model} should not have max_tokens"
@@ -1144,7 +1186,7 @@ mod tests {
"messages": [{"role": "user", "content": "Hello"}]
});
let result = anthropic_to_openai(input, None).unwrap();
let result = anthropic_to_openai(input).unwrap();
assert_eq!(result["max_tokens"], 1024);
assert!(result.get("max_completion_tokens").is_none());
}

View File

@@ -23,7 +23,6 @@ use std::net::SocketAddr;
use std::sync::Arc;
use tokio::sync::{oneshot, RwLock};
use tokio::task::JoinHandle;
use tower_http::cors::{Any, CorsLayer};
/// Proxy server state (shared)
#[derive(Clone)]
@@ -275,11 +274,6 @@ impl ProxyServer {
}
fn build_router(&self) -> Router {
let cors = CorsLayer::new()
.allow_origin(Any)
.allow_methods(Any)
.allow_headers(Any);
Router::new()
// Health check
.route("/health", get(handlers::health_check))
@@ -328,7 +322,6 @@ impl ProxyServer {
.route("/gemini/v1beta/*path", post(handlers::handle_gemini))
// Raise the default request body size limit (avoids 413 Payload Too Large)
.layer(DefaultBodyLimit::max(200 * 1024 * 1024))
.layer(cors)
.with_state(self.state.clone())
}

View File

@@ -372,7 +372,7 @@ impl StreamCheckService {
anthropic_to_responses(anthropic_body, Some(&provider.id), is_codex_oauth)
.map_err(|e| AppError::Message(format!("Failed to build test request: {e}")))?
} else if is_openai_chat {
anthropic_to_openai(anthropic_body, Some(&provider.id))
anthropic_to_openai(anthropic_body)
.map_err(|e| AppError::Message(format!("Failed to build test request: {e}")))?
} else {
anthropic_body

View File

@@ -175,6 +175,8 @@ pub struct AppSettings {
pub show_in_tray: bool,
#[serde(default = "default_minimize_to_tray_on_close")]
pub minimize_to_tray_on_close: bool,
#[serde(default)]
pub use_app_window_controls: bool,
/// Whether Claude plugin integration is enabled
#[serde(default)]
pub enable_claude_plugin_integration: bool,
@@ -293,6 +295,7 @@ impl Default for AppSettings {
Self {
show_in_tray: true,
minimize_to_tray_on_close: true,
use_app_window_controls: false,
enable_claude_plugin_integration: false,
skip_claude_onboarding: false,
launch_on_startup: false,

View File

@@ -9,6 +9,10 @@ import {
Plus,
Settings,
ArrowLeft,
Minus,
Maximize2,
Minimize2,
X,
Book,
Wrench,
RefreshCw,
@@ -22,6 +26,7 @@ import {
Shield,
Cpu,
} from "lucide-react";
import { getCurrentWindow } from "@tauri-apps/api/window";
import type { Provider, VisibleApps } from "@/types";
import type { EnvConflict } from "@/types/env";
import { useProvidersQuery, useSettingsQuery } from "@/lib/query";
@@ -99,9 +104,8 @@ interface WebDavSyncStatusUpdatedPayload {
error?: string;
}
const DRAG_BAR_HEIGHT = isWindows() || isLinux() ? 0 : 28; // px
const DEFAULT_DRAG_BAR_HEIGHT = isWindows() || isLinux() ? 0 : 28; // px
const HEADER_HEIGHT = 64; // px
const CONTENT_TOP_OFFSET = DRAG_BAR_HEIGHT + HEADER_HEIGHT;
const STORAGE_KEY = "cc-switch-last-app";
const VALID_APPS: AppId[] = [
@@ -153,12 +157,17 @@ function App() {
const [currentView, setCurrentView] = useState<View>(getInitialView);
const [settingsDefaultTab, setSettingsDefaultTab] = useState("general");
const [isAddOpen, setIsAddOpen] = useState(false);
const [isWindowMaximized, setIsWindowMaximized] = useState(false);
useEffect(() => {
localStorage.setItem(VIEW_STORAGE_KEY, currentView);
}, [currentView]);
const { data: settingsData } = useSettingsQuery();
const useAppWindowControls =
isLinux() && (settingsData?.useAppWindowControls ?? false);
const dragBarHeight = useAppWindowControls ? 32 : DEFAULT_DRAG_BAR_HEIGHT;
const contentTopOffset = dragBarHeight + HEADER_HEIGHT;
const visibleApps: VisibleApps = settingsData?.visibleApps ?? {
claude: true,
codex: true,
@@ -392,6 +401,51 @@ function App() {
};
}, [queryClient, t]);
useEffect(() => {
let active = true;
let unlistenResize: (() => void) | undefined;
const setupWindowStateSync = async () => {
try {
const currentWindow = getCurrentWindow();
const syncWindowMaximizedState = async () => {
const maximized = await currentWindow.isMaximized();
if (active) {
setIsWindowMaximized(maximized);
}
};
await syncWindowMaximizedState();
unlistenResize = await currentWindow.onResized(() => {
void syncWindowMaximizedState();
});
} catch (error) {
console.error("[App] Failed to sync window maximized state", error);
}
};
void setupWindowStateSync();
return () => {
active = false;
unlistenResize?.();
};
}, []);
useEffect(() => {
// Skip while settingsData is still loading, to avoid overriding the Rust-side decoration state with the fallback false
if (!settingsData) return;
const syncWindowDecorations = async () => {
try {
await getCurrentWindow().setDecorations(!useAppWindowControls);
} catch (error) {
console.error("[App] Failed to update window decorations", error);
}
};
void syncWindowDecorations();
}, [useAppWindowControls, settingsData]);
useEffect(() => {
const checkEnvOnStartup = async () => {
try {
@@ -734,6 +788,44 @@ function App() {
}
};
const notifyWindowControlError = (error: unknown) => {
toast.error(
t("notifications.windowControlFailed", {
defaultValue: "窗口控制失败:{{error}}",
error: extractErrorMessage(error),
}),
);
};
const handleWindowMinimize = async () => {
try {
await getCurrentWindow().minimize();
} catch (error) {
console.error("[App] Failed to minimize window", error);
notifyWindowControlError(error);
}
};
const handleWindowToggleMaximize = async () => {
try {
const currentWindow = getCurrentWindow();
await currentWindow.toggleMaximize();
setIsWindowMaximized(await currentWindow.isMaximized());
} catch (error) {
console.error("[App] Failed to toggle maximize", error);
notifyWindowControlError(error);
}
};
const handleWindowClose = async () => {
try {
await getCurrentWindow().close();
} catch (error) {
console.error("[App] Failed to close window", error);
notifyWindowControlError(error);
}
};
const renderContent = () => {
const content = (() => {
switch (currentView) {
@@ -880,14 +972,57 @@ function App() {
return (
<div
className="flex flex-col h-screen overflow-hidden bg-background text-foreground selection:bg-primary/30"
style={{ overflowX: "hidden", paddingTop: CONTENT_TOP_OFFSET }}
style={{ overflowX: "hidden", paddingTop: contentTopOffset }}
>
{DRAG_BAR_HEIGHT > 0 && (
{(dragBarHeight > 0 || useAppWindowControls) && (
<div
className="fixed top-0 left-0 right-0 z-[60]"
className="fixed top-0 left-0 right-0 z-[70] flex items-center justify-end px-2"
data-tauri-drag-region
style={{ WebkitAppRegion: "drag", height: DRAG_BAR_HEIGHT } as any}
/>
style={{ WebkitAppRegion: "drag", height: dragBarHeight } as any}
>
{useAppWindowControls && (
<div
className="flex items-center gap-1"
style={{ WebkitAppRegion: "no-drag" } as any}
>
<Button
variant="ghost"
size="icon"
onClick={() => void handleWindowMinimize()}
title={t("header.windowMinimize")}
className="h-7 w-7"
>
<Minus className="w-4 h-4" />
</Button>
<Button
variant="ghost"
size="icon"
onClick={() => void handleWindowToggleMaximize()}
title={
isWindowMaximized
? t("header.windowRestore")
: t("header.windowMaximize")
}
className="h-7 w-7"
>
{isWindowMaximized ? (
<Minimize2 className="w-4 h-4" />
) : (
<Maximize2 className="w-4 h-4" />
)}
</Button>
<Button
variant="ghost"
size="icon"
onClick={() => void handleWindowClose()}
title={t("header.windowClose")}
className="h-7 w-7 hover:bg-red-500/15 hover:text-red-500"
>
<X className="w-4 h-4" />
</Button>
</div>
)}
</div>
)}
{showEnvBanner && envConflicts.length > 0 && (
<EnvWarningBanner
@@ -920,7 +1055,7 @@ function App() {
style={
{
...DRAG_REGION_STYLE,
top: DRAG_BAR_HEIGHT,
top: dragBarHeight,
height: HEADER_HEIGHT,
} as any
}

View File

@@ -3,6 +3,7 @@ import type { SettingsFormState } from "@/hooks/useSettings";
import { AppWindow, MonitorUp, Power, EyeOff } from "lucide-react";
import { ToggleRow } from "@/components/ui/toggle-row";
import { AnimatePresence, motion } from "framer-motion";
import { isLinux } from "@/lib/platform";
interface WindowSettingsProps {
settings: SettingsFormState;
@@ -75,6 +76,18 @@ export function WindowSettings({ settings, onChange }: WindowSettingsProps) {
onChange({ minimizeToTrayOnClose: value })
}
/>
{isLinux() && (
<ToggleRow
icon={<AppWindow className="h-4 w-4 text-amber-500" />}
title={t("settings.useAppWindowControls")}
description={t("settings.useAppWindowControlsDescription")}
checked={!!settings.useAppWindowControls}
onCheckedChange={(value) =>
onChange({ useAppWindowControls: value })
}
/>
)}
</div>
</section>
);

View File

@@ -81,6 +81,7 @@ export function useSettingsForm(): UseSettingsFormResult {
...data,
showInTray: data.showInTray ?? true,
minimizeToTrayOnClose: data.minimizeToTrayOnClose ?? true,
useAppWindowControls: data.useAppWindowControls ?? false,
enableClaudePluginIntegration:
data.enableClaudePluginIntegration ?? false,
silentStartup: data.silentStartup ?? false,
@@ -105,6 +106,7 @@ export function useSettingsForm(): UseSettingsFormResult {
({
showInTray: true,
minimizeToTrayOnClose: true,
useAppWindowControls: false,
enableClaudePluginIntegration: false,
skipClaudeOnboarding: false,
language: readPersistedLanguage(),
@@ -139,6 +141,7 @@ export function useSettingsForm(): UseSettingsFormResult {
...serverData,
showInTray: serverData.showInTray ?? true,
minimizeToTrayOnClose: serverData.minimizeToTrayOnClose ?? true,
useAppWindowControls: serverData.useAppWindowControls ?? false,
enableClaudePluginIntegration:
serverData.enableClaudePluginIntegration ?? false,
silentStartup: serverData.silentStartup ?? false,

View File

@@ -91,7 +91,11 @@
"switchToChinese": "Switch to Chinese",
"switchToEnglish": "Switch to English",
"enterEditMode": "Enter Edit Mode",
"exitEditMode": "Exit Edit Mode"
"exitEditMode": "Exit Edit Mode",
"windowMinimize": "Minimize window",
"windowMaximize": "Maximize window",
"windowRestore": "Restore window",
"windowClose": "Close window"
},
"provider": {
"tabProvider": "Provider",
@@ -201,7 +205,8 @@
"openclawDefaultModelSet": "Set as default model",
"openclawDefaultModelSetFailed": "Failed to set default model",
"openclawNoModels": "No models configured",
"backfillWarning": "Switched successfully, but failed to save changes back to the previous provider"
"backfillWarning": "Switched successfully, but failed to save changes back to the previous provider",
"windowControlFailed": "Window control failed: {{error}}"
},
"confirm": {
"deleteProvider": "Delete Provider",
@@ -494,6 +499,8 @@
"autoLaunchFailed": "Failed to set auto-launch",
"minimizeToTray": "Minimize to tray on close",
"minimizeToTrayDescription": "When checked, clicking the close button will hide to system tray, otherwise the app will exit directly.",
"useAppWindowControls": "Enable app-level window controls",
"useAppWindowControlsDescription": "Use built-in minimize, maximize/restore, and close buttons in the app header.",
"enableClaudePluginIntegration": "Apply to Claude Code extension",
"enableClaudePluginIntegrationDescription": "When enabled, the VS Code Claude Code extension provider will switch with this app",
"skipClaudeOnboarding": "Skip Claude Code first-run confirmation",

View File

@@ -91,7 +91,11 @@
"switchToChinese": "中国語に切り替え",
"switchToEnglish": "英語に切り替え",
"enterEditMode": "編集モードに入る",
"exitEditMode": "編集モードを終了"
"exitEditMode": "編集モードを終了",
"windowMinimize": "ウィンドウを最小化",
"windowMaximize": "ウィンドウを最大化",
"windowRestore": "ウィンドウを元に戻す",
"windowClose": "ウィンドウを閉じる"
},
"provider": {
"tabProvider": "プロバイダー",
@@ -201,7 +205,8 @@
"openclawDefaultModelSet": "デフォルトモデルに設定しました",
"openclawDefaultModelSetFailed": "デフォルトモデルの設定に失敗しました",
"openclawNoModels": "モデルが設定されていません",
"backfillWarning": "切り替え成功しましたが、前のプロバイダーへの設定保存に失敗しました"
"backfillWarning": "切り替え成功しましたが、前のプロバイダーへの設定保存に失敗しました",
"windowControlFailed": "ウィンドウ操作に失敗しました: {{error}}"
},
"confirm": {
"deleteProvider": "プロバイダーを削除",
@@ -494,6 +499,8 @@
"autoLaunchFailed": "自動起動の設定に失敗しました",
"minimizeToTray": "閉じるときトレイへ最小化",
"minimizeToTrayDescription": "チェックすると閉じるボタンでトレイに隠し、オフならアプリを終了します。",
"useAppWindowControls": "アプリ内ウィンドウボタンを有効化",
"useAppWindowControlsDescription": "有効にすると、アプリヘッダーに最小化・最大化/復元・閉じるボタンを表示します。",
"enableClaudePluginIntegration": "Claude Code 拡張に適用",
"enableClaudePluginIntegrationDescription": "オンにすると VS Code の Claude Code 拡張のプロバイダーも同期します",
"skipClaudeOnboarding": "Claude Code の初回確認をスキップ",

View File

@@ -91,7 +91,11 @@
"switchToChinese": "切换到中文",
"switchToEnglish": "切换到英文",
"enterEditMode": "进入编辑模式",
"exitEditMode": "退出编辑模式"
"exitEditMode": "退出编辑模式",
"windowMinimize": "最小化窗口",
"windowMaximize": "最大化窗口",
"windowRestore": "还原窗口",
"windowClose": "关闭窗口"
},
"provider": {
"tabProvider": "供应商",
@@ -201,7 +205,8 @@
"openclawDefaultModelSet": "已设为默认模型",
"openclawDefaultModelSetFailed": "设置默认模型失败",
"openclawNoModels": "该供应商没有配置模型",
"backfillWarning": "切换成功,但旧供应商配置回填失败,您手动修改的配置可能未保存"
"backfillWarning": "切换成功,但旧供应商配置回填失败,您手动修改的配置可能未保存",
"windowControlFailed": "窗口控制失败:{{error}}"
},
"confirm": {
"deleteProvider": "删除供应商",
@@ -494,6 +499,8 @@
"autoLaunchFailed": "设置开机自启失败",
"minimizeToTray": "关闭时最小化到托盘",
"minimizeToTrayDescription": "勾选后点击关闭按钮会隐藏到系统托盘,取消则直接退出应用。",
"useAppWindowControls": "启用应用级窗口按钮",
"useAppWindowControlsDescription": "开启后使用应用自建的最小化、最大化/还原、关闭按钮;关闭后沿用系统窗口模式。",
"enableClaudePluginIntegration": "应用到 Claude Code 插件",
"enableClaudePluginIntegrationDescription": "开启后 Vscode Claude Code 插件的供应商将随本软件切换",
"skipClaudeOnboarding": "跳过 Claude Code 初次安装确认",

View File

@@ -166,7 +166,7 @@ export interface ProviderMeta {
apiKeyField?: ClaudeApiKeyField;
// Whether to treat base_url as the full API endpoint (the proxy uses this URL directly without appending paths)
isFullUrl?: boolean;
// Prompt cache key for OpenAI-compatible endpoints (improves cache hit rate)
// Prompt cache key for OpenAI Responses-compatible endpoints (improves cache hit rate)
promptCacheKey?: string;
// Provider type (identifies special providers such as Copilot)
providerType?: string;
@@ -244,6 +244,8 @@ export interface Settings {
showInTray: boolean;
// Whether clicking the close button minimizes to the tray instead of quitting the app
minimizeToTrayOnClose: boolean;
// Whether app-level window control buttons (minimize/maximize/close) are enabled
useAppWindowControls?: boolean;
// Enable Claude plugin integration (writes primaryApiKey to ~/.claude/config.json)
enableClaudePluginIntegration?: boolean;
// Skip Claude Code first-run confirmation (writes hasCompletedOnboarding to ~/.claude.json)