mirror of
https://github.com/ILoveBingLu/CipherTalk.git
synced 2026-05-16 17:29:30 +08:00
feat(ai): add Ollama local AI and custom provider support

- Add an Ollama local AI provider with local model execution and a configurable service address
- Add a Custom (OpenAI-compatible) provider supporting any OpenAI-compatible API service
- Add usage guides for Ollama and custom services covering installation, configuration, and FAQs
- Add an IPC handler for reading AI service guides so the frontend can load the docs dynamically
- Update the AI summary settings UI with logo assets for Ollama and the custom provider
- Extend AI service configuration with custom baseURL and model selection
- Bump the version to 2.1.3; update dependencies and build configuration
- Update .gitignore to exclude the native-dlls directory
- Improve privacy and flexibility with offline local AI and self-hosted services
@@ -48,4 +48,6 @@ Docs
# WeFolw
WeFolw
upx
native-dlls
MyCoolInstaller
@@ -7,7 +7,7 @@

**A modern WeChat chat-history viewer and analysis tool**

[](LICENSE)
[](package.json)
[](package.json)
[]()
[]()
[]()
@@ -296,6 +296,7 @@ export const useChatStore = create<ChatStore>((set) => ({

| Channel | Link |
|:---:|:---|
| 🌐 **Official website** | [密语 CipherTalk](https://miyuapp.aiqji.com) |
| 🐛 **Bug reports** | [GitHub Issues](https://github.com/ILoveBingLu/CipherTalk/issues) |
| 💬 **Discussions** | [GitHub Discussions](https://github.com/ILoveBingLu/CipherTalk/discussions) |
| 📱 **Telegram group** | [Join the chat](https://t.me/+toZ7bY15IZo3NjVl) |
@@ -310,7 +311,7 @@ export const useChatStore = create<ChatStore>((set) => ({

Thanks to every developer who contributes to the open-source community!

Special thanks:
- **[WeFlow](https://github.com/ILoveBingLu/WeFlow)** - served as a reference for some features
- **[WeFlow](https://github.com/hicccc77/WeFlow)** - served as a reference for some features
- **All contributors** - thank you to everyone who has contributed to this project

---
@@ -2088,6 +2088,20 @@ function registerIpcHandlers() {
    }
  })

  // Read an AI service usage guide
  ipcMain.handle('ai:readGuide', async (_, guideName: string) => {
    try {
      const guidePath = join(__dirname, '../electron/services/ai', guideName)
      if (!existsSync(guidePath)) {
        return { success: false, error: '指南文件不存在' }
      }
      const content = readFileSync(guidePath, 'utf-8')
      return { success: true, content }
    } catch (e) {
      return { success: false, error: String(e) }
    }
  })

  ipcMain.handle('ai:generateSummary', async (event, sessionId: string, timeRange: number, options: {
    provider: string
    apiKey: string
@@ -0,0 +1,119 @@
# Ollama Local AI Guide

## What is Ollama?

Ollama is an open-source tool for running large language models on your own machine. It supports a wide range of open-source models, is completely free, and never uploads your data to the cloud.

## Installing Ollama

### Windows

1. Visit the [Ollama website](https://ollama.com/)
2. Download the Windows installer
3. Run the installer
4. Once installed, Ollama runs automatically in the background

### Verifying the installation

Open a terminal (CMD or PowerShell) and run:

```bash
ollama --version
```

If a version number is printed, the installation succeeded.

## Downloading models

Ollama supports many open-source models. The following are recommended:

### 1. Qwen2.5 (Alibaba Tongyi Qianwen) - recommended
```bash
ollama pull qwen2.5:latest
```

### 2. DeepSeek R1 (DeepSeek)
```bash
ollama pull deepseek-r1:latest
```

### 3. Llama 3.3 (Meta)
```bash
ollama pull llama3.3:latest
```

### 4. Gemma 2 (Google)
```bash
ollama pull gemma2:latest
```
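After pulling, you can check which models are installed locally with the standard Ollama CLI (the `gemma2:latest` name below is just an example):

```shell
# List the models that have been downloaded
ollama list

# Remove a model you no longer need to free disk space
ollama rm gemma2:latest
```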
## Configuring Ollama in CipherTalk

1. Open the CipherTalk settings page
2. Switch to the "AI Summary" tab
3. Select the "Ollama (local)" provider
4. Configuration options:
   - **API key**: a local service needs no key; leave it blank or enter anything
   - **Service address**: defaults to `http://localhost:11434/v1`; usually no change is needed
   - **Model**: pick a downloaded model from the dropdown, or type a model name manually
5. Click "Test connection" to verify the configuration
6. If the connection succeeds, you are ready to go
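You can also sanity-check the endpoint outside CipherTalk; both requests below assume Ollama is running on its default port:

```shell
# Models via the OpenAI-compatible API (this is the interface CipherTalk uses)
curl -s http://localhost:11434/v1/models

# The same information via Ollama's native API
curl -s http://localhost:11434/api/tags
```

If either request returns a JSON model list, the service is reachable.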
## FAQ

### Q: "Ollama service not running" is reported

**A:** Make sure Ollama is installed and running in the background. You can try:
- Restarting the Ollama service
- Running `ollama serve` from the command line

### Q: The model list does not contain the model I want

**A:** You can type the model name manually. All models supported by Ollama are listed in the [Ollama model library](https://ollama.com/library).

### Q: Summary generation is slow

**A:** Local inference speed depends on your hardware:
- **CPU mode**: slow; suitable for small models
- **GPU acceleration**: requires an NVIDIA GPU and is much faster

Recommended setup:
- At least 8 GB of RAM
- An NVIDIA GPU (optional, but strongly recommended)

### Q: How do I switch to GPU mode?

**A:** Ollama detects and uses the GPU automatically. If you have an NVIDIA GPU with CUDA installed, Ollama will use GPU acceleration on its own.

### Q: What if I changed the port?

**A:** If you changed Ollama's default port (11434), update the "Service address" in CipherTalk accordingly, for example:
- `http://localhost:8080/v1`
- `http://192.168.1.100:11434/v1` (remote server)
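Ollama reads its listen address from the `OLLAMA_HOST` environment variable, so running it on a different port can be sketched as (port 8080 is an arbitrary example):

```shell
# Serve Ollama on port 8080 instead of the default 11434
OLLAMA_HOST=127.0.0.1:8080 ollama serve
```

With this setup, set CipherTalk's service address to `http://127.0.0.1:8080/v1`.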
## Advantages

✅ **Completely free**: no API key to buy
✅ **Data privacy**: everything is processed locally; nothing is uploaded
✅ **Works offline**: no network connection required
✅ **Multiple models**: switch between models at any time

## Disadvantages

❌ **Uses local resources**: consumes CPU/GPU and memory
❌ **Slower**: generation is slower than cloud APIs
❌ **Model quality**: open-source models may not match commercial ones

## Recommended use cases

- You care about data privacy
- You do not want to pay for an API
- You have reasonably good hardware
- You need offline access

## More information

- [Ollama website](https://ollama.com/)
- [Ollama GitHub](https://github.com/ollama/ollama)
- [Model library](https://ollama.com/library)
@@ -9,6 +9,8 @@ import { SiliconFlowProvider, SiliconFlowMetadata } from './providers/siliconflo
import { XiaomiProvider, XiaomiMetadata } from './providers/xiaomi'
import { OpenAIProvider, OpenAIMetadata } from './providers/openai'
import { GeminiProvider, GeminiMetadata } from './providers/gemini'
import { OllamaProvider, OllamaMetadata } from './providers/ollama'
import { CustomProvider, CustomMetadata } from './providers/custom'
import { AIProvider } from './providers/base'
import type { Message, Contact } from '../chatService'
import { voiceTranscribeService } from '../voiceTranscribeService'
@@ -81,6 +83,8 @@ class AIService {
   */
  getAllProviders() {
    return [
      CustomMetadata,
      OllamaMetadata,
      OpenAIMetadata,
      GeminiMetadata,
      DeepSeekMetadata,
@@ -111,6 +115,19 @@ class AIService {
    }

    switch (name) {
      case 'custom': {
        // A custom service must provide a baseURL
        const customConfig = this.configService.getAIProviderConfig('custom')
        const customBaseURL = customConfig?.baseURL
        if (!customBaseURL) {
          throw new Error('自定义服务需要配置服务地址')
        }
        return new CustomProvider(key || '', customBaseURL)
      }
      case 'ollama': {
        // Ollama supports a custom baseURL
        const ollamaConfig = this.configService.getAIProviderConfig('ollama')
        const baseURL = ollamaConfig?.baseURL || 'http://localhost:11434/v1'
        return new OllamaProvider(key || 'ollama', baseURL)
      }
      case 'openai':
        return new OpenAIProvider(key)
      case 'gemini':
@@ -0,0 +1,116 @@
import { BaseAIProvider } from './base'

/**
 * Custom provider metadata
 */
export const CustomMetadata = {
  id: 'custom',
  name: 'custom',
  displayName: '自定义(OpenAI 兼容)',
  description: '支持任何 OpenAI 兼容的 API 服务',
  models: [
    'gpt-4o',
    'gpt-4o-mini',
    'gpt-4-turbo',
    'gpt-3.5-turbo',
    'claude-3-5-sonnet-20241022',
    'claude-3-5-haiku-20241022',
    'gemini-2.0-flash-exp',
    'deepseek-chat',
    'qwen-plus',
    'custom-model'
  ],
  pricing: '根据实际服务商定价',
  pricingDetail: {
    input: 0, // custom service; actual price unknown
    output: 0 // custom service; actual price unknown
  },
  website: '',
  logo: './AI-logo/custom.svg'
}

/**
 * Custom provider
 * Supports any OpenAI-compatible API service,
 * e.g. OneAPI, API2D, self-hosted relays
 */
export class CustomProvider extends BaseAIProvider {
  name = CustomMetadata.name
  displayName = CustomMetadata.displayName
  models = CustomMetadata.models
  pricing = CustomMetadata.pricingDetail

  constructor(apiKey: string, baseURL: string) {
    // A custom service must provide a baseURL
    super(apiKey, baseURL || 'https://api.openai.com/v1')
  }

  /**
   * Test the connection - overridden to give friendlier error messages
   */
  async testConnection(): Promise<{ success: boolean; error?: string; needsProxy?: boolean }> {
    try {
      const client = await this.getClient()

      // Create a timeout promise
      const timeoutPromise = new Promise<never>((_, reject) => {
        setTimeout(() => reject(new Error('CONNECTION_TIMEOUT')), 15000) // 15-second timeout
      })

      // Race the API request against the timeout
      await Promise.race([
        client.models.list(),
        timeoutPromise
      ])

      return { success: true }
    } catch (error: any) {
      const errorMessage = error?.message || String(error)
      console.error(`[${this.name}] 连接测试失败:`, errorMessage)

      // Decide whether a proxy is likely needed
      const needsProxy =
        errorMessage.includes('ECONNREFUSED') ||
        errorMessage.includes('ETIMEDOUT') ||
        errorMessage.includes('ENOTFOUND') ||
        errorMessage.includes('CONNECTION_TIMEOUT') ||
        errorMessage.includes('getaddrinfo') ||
        error?.code === 'ECONNREFUSED' ||
        error?.code === 'ETIMEDOUT' ||
        error?.code === 'ENOTFOUND'

      // Build the error message
      let errorMsg = '连接失败'

      if (errorMessage.includes('CONNECTION_TIMEOUT')) {
        errorMsg = '连接超时,请检查服务地址或开启代理'
      } else if (errorMessage.includes('ECONNREFUSED')) {
        errorMsg = '连接被拒绝,请检查服务地址是否正确'
      } else if (errorMessage.includes('ETIMEDOUT')) {
        errorMsg = '连接超时,请检查网络或开启代理'
      } else if (errorMessage.includes('ENOTFOUND') || errorMessage.includes('getaddrinfo')) {
        errorMsg = '无法解析域名,请检查服务地址'
      } else if (errorMessage.includes('401') || errorMessage.includes('Unauthorized')) {
        errorMsg = 'API Key 无效,请检查配置'
      } else if (errorMessage.includes('403') || errorMessage.includes('Forbidden')) {
        errorMsg = '访问被禁止,请检查 API Key 权限'
      } else if (errorMessage.includes('404')) {
        errorMsg = 'API 端点不存在,请检查服务地址(需包含 /v1)'
      } else if (errorMessage.includes('429')) {
        errorMsg = '请求过于频繁,请稍后再试'
      } else if (errorMessage.includes('500') || errorMessage.includes('502') || errorMessage.includes('503')) {
        errorMsg = '服务器错误,请稍后再试'
      } else if (needsProxy) {
        errorMsg = '网络连接失败,请检查服务地址或开启代理'
      } else {
        errorMsg = `连接失败: ${errorMessage}`
      }

      return {
        success: false,
        error: errorMsg,
        needsProxy
      }
    }
  }
}
@@ -0,0 +1,96 @@
import { BaseAIProvider } from './base'

/**
 * Ollama provider metadata
 */
export const OllamaMetadata = {
  id: 'ollama',
  name: 'ollama',
  displayName: 'Ollama (本地)',
  description: '本地运行的开源大模型服务',
  models: [
    'qwen2.5:latest',
    'llama3.3:latest',
    'deepseek-r1:latest',
    'gemma2:latest',
    'mistral:latest',
    'phi4:latest',
    'qwen2.5-coder:latest'
  ],
  pricing: '免费(本地运行)',
  pricingDetail: {
    input: 0, // runs locally, no cost
    output: 0 // runs locally, no cost
  },
  website: 'https://ollama.com/',
  logo: './AI-logo/ollama.svg'
}

/**
 * Ollama provider
 * Supports a locally running Ollama service
 */
export class OllamaProvider extends BaseAIProvider {
  name = OllamaMetadata.name
  displayName = OllamaMetadata.displayName
  models = OllamaMetadata.models
  pricing = OllamaMetadata.pricingDetail

  constructor(apiKey: string = 'ollama', baseURL?: string) {
    // Ollama runs at http://localhost:11434 by default.
    // Ollama does not require an apiKey, but we accept one to keep the interface uniform.
    super(apiKey, baseURL || 'http://localhost:11434/v1')
  }

  /**
   * Test the connection - overridden to suit Ollama
   */
  async testConnection(): Promise<{ success: boolean; error?: string; needsProxy?: boolean }> {
    try {
      const client = await this.getClient()

      // Create a timeout promise
      const timeoutPromise = new Promise<never>((_, reject) => {
        setTimeout(() => reject(new Error('CONNECTION_TIMEOUT')), 10000) // 10-second timeout
      })

      // Ollama's native API lists models at /api/tags, but since we talk to the
      // OpenAI-compatible interface, try listing models there instead
      await Promise.race([
        client.models.list(),
        timeoutPromise
      ])

      return { success: true }
    } catch (error: any) {
      const errorMessage = error?.message || String(error)
      console.error(`[${this.name}] 连接测试失败:`, errorMessage)

      // Ollama is a local service, so no proxy is needed
      const needsProxy = false

      // Build the error message
      let errorMsg = '连接失败'

      if (errorMessage.includes('CONNECTION_TIMEOUT')) {
        errorMsg = '连接超时,请确认 Ollama 服务已启动(默认端口 11434)'
      } else if (errorMessage.includes('ECONNREFUSED')) {
        errorMsg = 'Ollama 服务未启动,请先运行 "ollama serve"'
      } else if (errorMessage.includes('ETIMEDOUT')) {
        errorMsg = '连接超时,请检查 Ollama 服务状态'
      } else if (errorMessage.includes('ENOTFOUND') || errorMessage.includes('getaddrinfo')) {
        errorMsg = '无法连接到 Ollama 服务,请检查地址配置'
      } else if (errorMessage.includes('404')) {
        errorMsg = 'Ollama API 端点不存在,请检查服务版本'
      } else {
        errorMsg = `连接失败: ${errorMessage}。请确认 Ollama 已安装并运行`
      }

      return {
        success: false,
        error: errorMsg,
        needsProxy
      }
    }
  }
}
@@ -0,0 +1,205 @@
# Custom AI Service Guide

## What is a custom AI service?

The custom AI service option supports any third-party service compatible with the OpenAI API format, including but not limited to:

- **API relay services**: OneAPI, API2D, OpenAI-SB, etc.
- **Self-hosted relays**: relays built with one-api, new-api, and similar projects
- **Third-party aggregation platforms**: services that aggregate multiple AI models
- **In-house services**: AI services deployed inside your company

## When to use it

✅ Using an API relay service (such as OneAPI)
✅ Self-hosting an AI relay
✅ Using a third-party aggregation platform
✅ In-house enterprise AI services
✅ Any scenario that needs a custom API endpoint

## Configuration steps

### 1. Gather service information

Obtain the following from your AI service provider:

- **API address**: the service's API endpoint (must be OpenAI-compatible)
- **API key**: the key used for authentication
- **Model names**: the list of available models

### 2. Configure CipherTalk

1. Open the CipherTalk settings page
2. Switch to the "AI Summary" tab
3. Select the "Custom (OpenAI-compatible)" provider
4. Fill in the configuration:
   - **API key**: enter your API key
   - **Service address**: enter the full API address (must include `/v1`)
   - **Model**: pick from the dropdown or type a model name
5. Click "Test connection" to verify the configuration
6. Once the test succeeds, you are ready to go
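Before wiring a service into CipherTalk, you can verify that it really speaks the OpenAI protocol with a direct request (the domain, `YOUR_KEY`, and the model name below are placeholders for your own values):

```shell
# Minimal OpenAI-style chat completion request against a custom endpoint
curl -s https://your-api-domain.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "ping"}]}'
```

A compatible service returns a JSON object with a `choices` array; an HTML page or a 404 means the address is wrong.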
## Service address format

The service address must be a complete API endpoint, in this form:

```
https://your-api-domain.com/v1
```

### Common address examples

#### OneAPI
```
https://your-oneapi-domain.com/v1
```

#### API2D
```
https://openai.api2d.net/v1
```

#### Self-hosted one-api
```
http://localhost:3000/v1
or
https://your-domain.com/v1
```

#### OpenAI-SB
```
https://api.openai-sb.com/v1
```

## Model configuration

### Preset models

The following commonly used models are preset:

- `gpt-4o`
- `gpt-4o-mini`
- `gpt-4-turbo`
- `gpt-3.5-turbo`
- `claude-3-5-sonnet-20241022`
- `claude-3-5-haiku-20241022`
- `gemini-2.0-flash-exp`
- `deepseek-chat`
- `qwen-plus`

### Custom models

If your service offers other models, you can type the model name manually:

1. Click the "Model" dropdown
2. Type the model name directly (e.g. `gpt-4-1106-preview`)
3. The name you enter is saved automatically
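If you are unsure which names your service accepts, most OpenAI-compatible services list them at `/v1/models` (domain and key below are placeholders):

```shell
# Ask the service which model IDs it serves
curl -s https://your-api-domain.com/v1/models \
  -H "Authorization: Bearer YOUR_KEY"
```

The `id` fields in the response are the strings to enter in the model field.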
## FAQ

### Q: "Connection failed" is reported

**A:** Check that:
1. The service address is correct (it must include `/v1`)
2. The API key is valid
3. Your network connection works
4. A proxy is enabled if the service is hosted abroad

### Q: "API endpoint not found" is reported

**A:** The service address format is wrong. Make sure:
- The address ends with `/v1`
- It uses the `https://` or `http://` protocol
- It has no extra path segments (such as `/chat/completions`)

### Q: "Invalid API key" is reported

**A:** Check that:
1. The key was copied correctly (no stray whitespace)
2. The key has not expired
3. The key has sufficient permissions

### Q: How do I know whether a service is OpenAI-compatible?

**A:** OpenAI-compatible services typically:
- Expose a `/v1/chat/completions` endpoint
- Accept OpenAI's request format
- State "OpenAI API compatible" explicitly in their documentation

### Q: Can I use a locally deployed service?

**A:** Yes! Any service that speaks the OpenAI API format works. For example:
```
http://localhost:8000/v1
http://192.168.1.100:3000/v1
```

### Q: Which models are supported?

**A:** In principle, every model behind an OpenAI-compatible API, including:
- OpenAI models (GPT-4, GPT-3.5, ...)
- Claude models (via a relay)
- Gemini models (via a relay)
- Chinese models (Tongyi Qianwen, ERNIE Bot, etc., via a relay)
- Open-source models (deployed with vLLM, Ollama, ...)

## Recommended services

### OneAPI (recommended)

- **Website**: https://github.com/songquanpeng/one-api
- **Highlights**: open source, free, supports many models
- **Deployment**: self-host it or use a public instance

### API2D

- **Website**: https://api2d.com/
- **Highlights**: stable, fast, supports many models
- **Pricing**: pay as you go

### OpenAI-SB

- **Website**: https://openai-sb.com/
- **Highlights**: reachable from mainland China, stable
- **Pricing**: pay as you go

## Security notes

⚠️ **Keep in mind**:

1. **API key safety**: never share your API key with others
2. **Service trustworthiness**: use providers you trust
3. **Data privacy**: understand the provider's data-handling policy
4. **Cost control**: watch your API usage to avoid overspending

## Pros and cons

### ✅ Pros

- Highly flexible: any compatible service works
- Self-hosted services give you full control
- Easy switching between multiple models
- Cheaper relay services are an option

### ❌ Cons

- You must find a reliable provider yourself
- Configuration is more involved
- Service quality depends on the provider
- Extra network configuration may be needed

## Troubleshooting

If you run into problems:

1. Check your provider's documentation
2. Confirm the service is OpenAI-compatible
3. Use the "Test connection" feature to diagnose the issue
4. Read the error message shown

## Links

- [OpenAI API reference](https://platform.openai.com/docs/api-reference)
- [OneAPI project](https://github.com/songquanpeng/one-api)
- [New API project](https://github.com/Calcium-Ion/new-api)
@@ -287,18 +287,18 @@ export class ConfigService {
    this.set('aiCurrentProvider', provider)
  }

  getAIProviderConfig(providerId: string): { apiKey: string; model: string } | null {
  getAIProviderConfig(providerId: string): { apiKey: string; model: string; baseURL?: string } | null {
    const configs = this.get('aiProviderConfigs')
    return configs[providerId] || null
  }

  setAIProviderConfig(providerId: string, config: { apiKey: string; model: string }): void {
  setAIProviderConfig(providerId: string, config: { apiKey: string; model: string; baseURL?: string }): void {
    const configs = this.get('aiProviderConfigs')
    configs[providerId] = config
    this.set('aiProviderConfigs', configs)
  }

  getAllAIProviderConfigs(): { [providerId: string]: { apiKey: string; model: string } } {
  getAllAIProviderConfigs(): { [providerId: string]: { apiKey: string; model: string; baseURL?: string } } {
    return this.get('aiProviderConfigs')
  }
}
@@ -276,7 +276,7 @@ export class WxKeyService {
    const keyBuffer = Buffer.alloc(65)
    if (PollKeyData(keyBuffer, 65)) {
      const key = keyBuffer.toString('utf8').replace(/\0/g, '').trim()
      console.log('轮询到密钥:', key, '长度:', key.length)

      if (key && this.onKeyReceived) {
        this.onKeyReceived(key)
      }
@@ -0,0 +1,8 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="none">
  <rect x="3" y="3" width="18" height="18" rx="4" stroke="currentColor" stroke-width="2" fill="none"/>
  <path d="M8 12h8M12 8v8" stroke="currentColor" stroke-width="2" stroke-linecap="round"/>
  <circle cx="8" cy="8" r="1.5" fill="currentColor"/>
  <circle cx="16" cy="8" r="1.5" fill="currentColor"/>
  <circle cx="8" cy="16" r="1.5" fill="currentColor"/>
  <circle cx="16" cy="16" r="1.5" fill="currentColor"/>
</svg>
@@ -0,0 +1 @@
<svg fill="currentColor" fill-rule="evenodd" height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Ollama</title><path d="M7.905 1.09c.216.085.411.225.588.41.295.306.544.744.734 1.263.191.522.315 1.1.362 1.68a5.054 5.054 0 012.049-.636l.051-.004c.87-.07 1.73.087 2.48.474.101.053.2.11.297.17.05-.569.172-1.134.36-1.644.19-.52.439-.957.733-1.264a1.67 1.67 0 01.589-.41c.257-.1.53-.118.796-.042.401.114.745.368 1.016.737.248.337.434.769.561 1.287.23.934.27 2.163.115 3.645l.053.04.026.019c.757.576 1.284 1.397 1.563 2.35.435 1.487.216 3.155-.534 4.088l-.018.021.002.003c.417.762.67 1.567.724 2.4l.002.03c.064 1.065-.2 2.137-.814 3.19l-.007.01.01.024c.472 1.157.62 2.322.438 3.486l-.006.039a.651.651 0 01-.747.536.648.648 0 01-.54-.742c.167-1.033.01-2.069-.48-3.123a.643.643 0 01.04-.617l.004-.006c.604-.924.854-1.83.8-2.72-.046-.779-.325-1.544-.8-2.273a.644.644 0 01.18-.886l.009-.006c.243-.159.467-.565.58-1.12a4.229 4.229 0 00-.095-1.974c-.205-.7-.58-1.284-1.105-1.683-.595-.454-1.383-.673-2.38-.61a.653.653 0 01-.632-.371c-.314-.665-.772-1.141-1.343-1.436a3.288 3.288 0 00-1.772-.332c-1.245.099-2.343.801-2.67 1.686a.652.652 0 01-.61.425c-1.067.002-1.893.252-2.497.703-.522.39-.878.935-1.066 1.588a4.07 4.07 0 00-.068 1.886c.112.558.331 1.02.582 1.269l.008.007c.212.207.257.53.109.785-.36.622-.629 1.549-.673 2.44-.05 1.018.186 1.902.719 2.536l.016.019a.643.643 0 01.095.69c-.576 1.236-.753 2.252-.562 3.052a.652.652 0 01-1.269.298c-.243-1.018-.078-2.184.473-3.498l.014-.035-.008-.012a4.339 4.339 0 01-.598-1.309l-.005-.019a5.764 5.764 0 01-.177-1.785c.044-.91.278-1.842.622-2.59l.012-.026-.002-.002c-.293-.418-.51-.953-.63-1.545l-.005-.024a5.352 5.352 0 01.093-2.49c.262-.915.777-1.701 1.536-2.269.06-.045.123-.09.186-.132-.159-1.493-.119-2.73.112-3.67.127-.518.314-.95.562-1.287.27-.368.614-.622 1.015-.737.266-.076.54-.059.797.042zm4.116 9.09c.936 0 1.8.313 2.446.855.63.527 1.005 1.235 1.005 1.94 0 .888-.406 1.58-1.133 2.022-.62.375-1.451.557-2.403.557-1.009 0-1.871-.259-2.493-.734-.617-.47-.963-1.13-.963-1.845 0-.707.398-1.417 1.056-1.946.668-.537 1.55-.849 2.485-.849zm0 .896a3.07 3.07 0 00-1.916.65c-.461.37-.722.835-.722 1.25 0 .428.21.829.61 1.134.455.347 1.124.548 1.943.548.799 0 1.473-.147 1.932-.426.463-.28.7-.686.7-1.257 0-.423-.246-.89-.683-1.256-.484-.405-1.14-.643-1.864-.643zm.662 1.21l.004.004c.12.151.095.37-.056.49l-.292.23v.446a.375.375 0 01-.376.373.375.375 0 01-.376-.373v-.46l-.271-.218a.347.347 0 01-.052-.49.353.353 0 01.494-.051l.215.172.22-.174a.353.353 0 01.49.051zm-5.04-1.919c.478 0 .867.39.867.871a.87.87 0 01-.868.871.87.87 0 01-.867-.87.87.87 0 01.867-.872zm8.706 0c.48 0 .868.39.868.871a.87.87 0 01-.868.871.87.87 0 01-.867-.87.87.87 0 01.867-.872zM7.44 2.3l-.003.002a.659.659 0 00-.285.238l-.005.006c-.138.189-.258.467-.348.832-.17.692-.216 1.631-.124 2.782.43-.128.899-.208 1.404-.237l.01-.001.019-.034c.046-.082.095-.161.148-.239.123-.771.022-1.692-.253-2.444-.134-.364-.297-.65-.453-.813a.628.628 0 00-.107-.09L7.44 2.3zm9.174.04l-.002.001a.628.628 0 00-.107.09c-.156.163-.32.45-.453.814-.29.794-.387 1.776-.23 2.572l.058.097.008.014h.03a5.184 5.184 0 011.466.212c.086-1.124.038-2.043-.128-2.722-.09-.365-.21-.643-.349-.832l-.004-.006a.659.659 0 00-.285-.239h-.004z"></path></svg>
Binary file not shown.
Binary file not shown.
@@ -185,6 +185,12 @@
    font-weight: 500;
    color: var(--text-primary);
    margin-bottom: 8px;

    &.label-with-help {
      display: flex;
      align-items: center;
      gap: 6px;
    }
  }

  // Standalone small popup hint
@@ -623,4 +629,524 @@
        }
      }
    }
  }
}


// Ollama help modal styles
.ollama-help-modal {
  position: fixed;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background: rgba(0, 0, 0, 0.5);
  backdrop-filter: blur(4px);
  display: flex;
  align-items: center;
  justify-content: center;
  z-index: 10000;
  animation: fadeIn 0.2s ease;
  padding: 20px;

  .ollama-help-content {
    background: var(--bg-primary);
    border-radius: 20px;
    max-width: 800px;
    width: 100%;
    max-height: 90vh;
    display: flex;
    flex-direction: column;
    box-shadow: 0 20px 60px rgba(0, 0, 0, 0.3);
    animation: slideInUp 0.3s cubic-bezier(0.16, 1, 0.3, 1);
  }

  .ollama-help-header {
    display: flex;
    align-items: center;
    justify-content: space-between;
    padding: 24px 28px;
    border-bottom: 1px solid var(--border-color);

    h2 {
      margin: 0;
      font-size: 20px;
      font-weight: 600;
      color: var(--text-primary);
    }

    .close-btn {
      width: 36px;
      height: 36px;
      border-radius: 50%;
      display: grid;
      place-items: center;
      background: var(--bg-secondary);
      border: none;
      color: var(--text-secondary);
      cursor: pointer;
      transition: all 0.2s;

      &:hover {
        background: var(--bg-tertiary);
        color: var(--text-primary);
      }
    }
  }

  .ollama-help-body {
    padding: 28px;
    overflow-y: auto;
    flex: 1;

    &::-webkit-scrollbar {
      width: 6px;
    }

    &::-webkit-scrollbar-thumb {
      background: var(--border-color);
      border-radius: 3px;
    }

    // Markdown content styles
    &.markdown-content {
      h1, h2, h3, h4, h5, h6 {
        color: var(--text-primary);
        margin: 24px 0 12px 0;
        font-weight: 600;

        &:first-child {
          margin-top: 0;
        }
      }

      h1 {
        font-size: 24px;
        border-bottom: 2px solid var(--border-color);
        padding-bottom: 8px;
      }

      h2 {
        font-size: 20px;
        display: flex;
        align-items: center;
        gap: 8px;

        &::before {
          content: '';
          width: 4px;
          height: 20px;
          background: var(--primary);
          border-radius: 2px;
        }
      }

      h3 {
        font-size: 16px;
        display: flex;
        align-items: center;
        gap: 8px;

        &::before {
          content: '';
          width: 4px;
          height: 16px;
          background: var(--primary);
          border-radius: 2px;
        }
      }

      h4 {
        font-size: 14px;
      }

      p {
        font-size: 14px;
        line-height: 1.6;
        color: var(--text-secondary);
        margin: 0 0 12px 0;

        &:last-child {
          margin-bottom: 0;
        }
      }

      ol, ul {
        margin: 0 0 12px 0;
        padding-left: 24px;

        li {
          font-size: 14px;
          line-height: 1.8;
          color: var(--text-secondary);
          margin-bottom: 6px;

          &:last-child {
            margin-bottom: 0;
          }
        }
      }

      a {
        color: var(--primary);
        text-decoration: none;
        transition: opacity 0.2s;

        &:hover {
          opacity: 0.8;
          text-decoration: underline;
        }
      }

      code {
        background: var(--bg-secondary);
        padding: 2px 8px;
        border-radius: 6px;
        font-family: 'Consolas', monospace;
        font-size: 13px;
        color: var(--primary);
        border: 1px solid var(--border-color);
      }

      pre {
        background: var(--bg-secondary);
        padding: 12px 16px;
        border-radius: 12px;
        border: 1px solid var(--border-color);
        overflow-x: auto;
        margin: 12px 0;

        code {
          background: transparent;
          padding: 0;
          border: none;
          color: var(--text-secondary);
        }
      }

      blockquote {
        border-left: 4px solid var(--primary);
        padding-left: 16px;
        margin: 12px 0;
        color: var(--text-secondary);
        font-style: italic;
      }

      table {
        width: 100%;
        border-collapse: collapse;
        margin: 12px 0;

        th, td {
          border: 1px solid var(--border-color);
          padding: 8px 12px;
          text-align: left;
        }

        th {
          background: var(--bg-secondary);
          font-weight: 600;
          color: var(--text-primary);
        }

        td {
          color: var(--text-secondary);
        }
      }

      hr {
        border: none;
        border-top: 1px solid var(--border-color);
        margin: 24px 0;
      }

      strong {
        font-weight: 600;
        color: var(--text-primary);
      }

      em {
        font-style: italic;
      }
    }

    section {
      margin-bottom: 28px;

      &:last-child {
        margin-bottom: 0;
      }

      h3 {
        font-size: 16px;
        font-weight: 600;
        color: var(--text-primary);
        margin: 0 0 12px 0;
        display: flex;
        align-items: center;
        gap: 8px;

        &::before {
          content: '';
          width: 4px;
          height: 16px;
          background: var(--primary);
          border-radius: 2px;
        }
      }

      h4 {
        font-size: 14px;
        font-weight: 600;
        color: var(--text-primary);
        margin: 0 0 8px 0;
      }

      p {
        font-size: 14px;
        line-height: 1.6;
        color: var(--text-secondary);
        margin: 0 0 12px 0;

        &:last-child {
          margin-bottom: 0;
        }
      }

      ol, ul {
        margin: 0 0 12px 0;
        padding-left: 24px;

        li {
          font-size: 14px;
          line-height: 1.8;
          color: var(--text-secondary);
          margin-bottom: 6px;

          &:last-child {
            margin-bottom: 0;
          }
        }
      }

      a {
        color: var(--primary);
        text-decoration: none;
        transition: opacity 0.2s;

        &:hover {
          opacity: 0.8;
          text-decoration: underline;
        }
      }

      code {
        background: var(--bg-secondary);
        padding: 2px 8px;
        border-radius: 6px;
        font-family: 'Consolas', monospace;
        font-size: 13px;
        color: var(--primary);
        border: 1px solid var(--border-color);
      }

      .code-block {
        background: var(--bg-secondary);
        padding: 12px 16px;
        border-radius: 12px;
        border: 1px solid var(--border-color);
        font-size: 13px;
        color: var(--text-secondary);
        margin: 12px 0;

        code {
          background: transparent;
          padding: 0;
          border: none;
        }
      }

      .model-list {
        display: flex;
        flex-direction: column;
        gap: 12px;
        margin-top: 12px;
      }

      .model-item {
        background: var(--bg-secondary);
        padding: 12px 16px;
        border-radius: 12px;
        border: 1px solid var(--border-color);

        strong {
          display: block;
          font-size: 14px;
          color: var(--text-primary);
          margin-bottom: 6px;
        }

        code {
          display: block;
          background: var(--bg-tertiary);
          padding: 8px 12px;
          border-radius: 8px;
          font-size: 13px;
          margin-top: 6px;
        }
      }

      .faq-item {
        background: var(--bg-secondary);
        padding: 16px;
        border-radius: 12px;
        border: 1px solid var(--border-color);
        margin-bottom: 12px;

        &:last-child {
          margin-bottom: 0;
        }

        strong {
          display: block;
          font-size: 14px;
          color: var(--text-primary);
          margin-bottom: 8px;
        }

        p {
          margin: 0;
          font-size: 13px;
          line-height: 1.6;
        }

        code {
          font-size: 12px;
        }
      }

      .pros-cons {
        display: grid;
        grid-template-columns: 1fr 1fr;
        gap: 16px;
        margin-top: 12px;

        .pros, .cons {
          background: var(--bg-secondary);
          padding: 16px;
          border-radius: 12px;
          border: 1px solid var(--border-color);

          h4 {
            margin-bottom: 12px;
          }

          ul {
            margin: 0;
            padding-left: 20px;

            li {
              font-size: 13px;
              line-height: 1.8;
            }
          }
        }
      }

      &.links-section {
        ul {
          list-style: none;
          padding: 0;
          margin: 12px 0 0 0;

          li {
            margin-bottom: 8px;

            &:last-child {
              margin-bottom: 0;
            }

            a {
              display: inline-flex;
              align-items: center;
              gap: 6px;
              font-size: 14px;
              padding: 8px 12px;
              background: var(--bg-secondary);
              border-radius: 8px;
              border: 1px solid var(--border-color);
              transition: all 0.2s;

              &:hover {
                background: var(--bg-tertiary);
                border-color: var(--primary);
                text-decoration: none;
              }

              &::before {
                content: '🔗';
              }
            }
          }
        }
      }
    }
  }
}

// Help icon button styles
.help-icon-btn {
  width: 20px;
  height: 20px;
  border-radius: 50%;
  display: grid;
  place-items: center;
  background: transparent;
  border: none;
  color: var(--text-tertiary);
  cursor: pointer;
  transition: all 0.2s;
  padding: 0;
  flex-shrink: 0;

  &:hover {
    color: var(--primary);
    background: color-mix(in srgb, var(--primary), transparent 90%);
  }
}

// Form hint text
.form-hint {
  font-size: 12px;
  color: var(--text-tertiary);
  margin-top: 6px;
  line-height: 1.5;
}

// Mobile adaptation
@media (max-width: 768px) {
  .ollama-help-modal {
    padding: 10px;

    .ollama-help-content {
      max-height: 95vh;
    }

    .ollama-help-header {
      padding: 20px;

      h2 {
        font-size: 18px;
      }
    }

    .ollama-help-body {
      padding: 20px;

      section {
        .pros-cons {
          grid-template-columns: 1fr;
        }
      }
    }
  }
}
@@ -1,6 +1,8 @@
 import { useState, useEffect, useRef } from 'react'
-import { Eye, EyeOff, Sparkles, Check, ChevronDown, ChevronUp, Zap, Star, FileText } from 'lucide-react'
+import { Eye, EyeOff, Sparkles, Check, ChevronDown, ChevronUp, Zap, Star, FileText, HelpCircle, X } from 'lucide-react'
 import { getAIProviders, type AIProviderInfo } from '../../types/ai'
+import { marked } from 'marked'
+import DOMPurify from 'dompurify'
 import './AISummarySettings.scss'
 
 interface CustomSelectProps {
@@ -125,7 +127,13 @@ function AISummarySettings({
   const [isTesting, setIsTesting] = useState(false)
   const [usageStats, setUsageStats] = useState<any>(null)
   const [providers, setProviders] = useState<AIProviderInfo[]>([])
-  const [providerConfigs, setProviderConfigs] = useState<{ [key: string]: { apiKey: string; model: string } }>({})
+  const [providerConfigs, setProviderConfigs] = useState<{ [key: string]: { apiKey: string; model: string; baseURL?: string } }>({})
+  const [baseURL, setBaseURL] = useState('')
+  const [showOllamaHelp, setShowOllamaHelp] = useState(false)
+  const [showCustomHelp, setShowCustomHelp] = useState(false)
+  const [ollamaGuideContent, setOllamaGuideContent] = useState('')
+  const [customGuideContent, setCustomGuideContent] = useState('')
+  const [isLoadingGuide, setIsLoadingGuide] = useState(false)
 
   useEffect(() => {
     // Load the provider list and usage statistics
@@ -134,6 +142,37 @@ function AISummarySettings({
     loadAllProviderConfigs()
   }, [])
 
+  // When the provider changes, load its saved baseURL
+  useEffect(() => {
+    const loadBaseURL = async () => {
+      if (provider === 'ollama' || provider === 'custom') {
+        const { getAiProviderConfig } = await import('../../services/config')
+        const config = await getAiProviderConfig(provider)
+        if (provider === 'ollama') {
+          setBaseURL(config?.baseURL || 'http://localhost:11434/v1')
+        } else if (provider === 'custom') {
+          setBaseURL(config?.baseURL || '')
+        }
+      } else {
+        setBaseURL('')
+      }
+    }
+    loadBaseURL()
+  }, [provider])
+
+  // When the baseURL changes, save it automatically (Ollama and Custom only)
+  useEffect(() => {
+    const saveBaseURL = async () => {
+      if ((provider === 'ollama' || provider === 'custom') && baseURL) {
+        const { setAiProviderConfig } = await import('../../services/config')
+        await setAiProviderConfig(provider, { apiKey, model, baseURL })
+      }
+    }
+    // Debounce the save so it does not fire during initialization
+    const timer = setTimeout(saveBaseURL, 500)
+    return () => clearTimeout(timer)
+  }, [baseURL, provider, apiKey, model])
+
   const loadProviders = async () => {
     try {
       const providerList = await getAIProviders()
@@ -155,12 +194,12 @@ function AISummarySettings({
 
   const handleProviderChange = async (newProvider: string) => {
     // Save the current provider's config first
-    if (provider && (apiKey || model)) {
+    if (provider && (apiKey || model || baseURL)) {
       const { setAiProviderConfig } = await import('../../services/config')
-      await setAiProviderConfig(provider, { apiKey, model })
+      await setAiProviderConfig(provider, { apiKey, model, baseURL: baseURL || undefined })
       setProviderConfigs(prev => ({
         ...prev,
-        [provider]: { apiKey, model }
+        [provider]: { apiKey, model, baseURL: baseURL || undefined }
       }))
     }
 
@@ -175,10 +214,19 @@ function AISummarySettings({
       // Use the saved config
       setApiKey(savedConfig.apiKey)
       setModel(savedConfig.model)
+      setBaseURL(savedConfig.baseURL || '')
     } else if (newProviderData) {
       // Use the default config
       setApiKey('')
       setModel(newProviderData.models[0])
+      // Default baseURL for Ollama and Custom
+      if (newProvider === 'ollama') {
+        setBaseURL('http://localhost:11434/v1')
+      } else if (newProvider === 'custom') {
+        setBaseURL('')
+      } else {
+        setBaseURL('')
+      }
     }
   }
 
@@ -225,6 +273,45 @@ function AISummarySettings({
     }
   }
 
+  // Load a usage guide
+  const loadGuide = async (guideName: string) => {
+    setIsLoadingGuide(true)
+    try {
+      const result = await window.electronAPI.ai.readGuide(guideName)
+      if (result.success && result.content) {
+        const html = await marked.parse(result.content)
+        const sanitized = DOMPurify.sanitize(html)
+        return sanitized
+      } else {
+        console.error('加载指南失败:', result.error)
+        return '<p>加载指南失败</p>'
+      }
+    } catch (e) {
+      console.error('加载指南异常:', e)
+      return '<p>加载指南失败</p>'
+    } finally {
+      setIsLoadingGuide(false)
+    }
+  }
+
+  // Open the Ollama help modal
+  const handleOpenOllamaHelp = async () => {
+    if (!ollamaGuideContent) {
+      const content = await loadGuide('Ollama使用指南.md')
+      setOllamaGuideContent(content)
+    }
+    setShowOllamaHelp(true)
+  }
+
+  // Open the custom-service help modal
+  const handleOpenCustomHelp = async () => {
+    if (!customGuideContent) {
+      const content = await loadGuide('自定义AI服务使用指南.md')
+      setCustomGuideContent(content)
+    }
+    setShowCustomHelp(true)
+  }
+
   const currentProvider = providers.find(p => p.id === provider) || providers[0]
   const modelOptions = currentProvider?.models.map(m => ({ value: m, label: m })) || []
   const timeRangeOptions = [
@@ -272,7 +359,13 @@ function AISummarySettings({
           <div className="input-with-actions">
             <input
               type={showApiKey ? 'text' : 'password'}
-              placeholder={`请输入 ${currentProvider?.displayName} API 密钥`}
+              placeholder={
+                provider === 'ollama'
+                  ? '本地服务无需密钥(可选)'
+                  : provider === 'custom'
+                    ? '请输入自定义服务的 API 密钥'
+                    : `请输入 ${currentProvider?.displayName} API 密钥`
+              }
               value={apiKey}
               onChange={(e) => setApiKey(e.target.value)}
               className="api-key-input"
@@ -289,7 +382,7 @@ function AISummarySettings({
               type="button"
               className="input-action-btn primary"
               onClick={handleTestConnection}
-              disabled={isTesting || !apiKey}
+              disabled={isTesting || (provider !== 'ollama' && !apiKey) || (provider === 'custom' && !baseURL)}
               title="测试连接"
             >
               {isTesting ? <Sparkles size={16} className="spin" /> : <Sparkles size={16} />}
@@ -297,6 +390,61 @@ function AISummarySettings({
           </div>
         </div>
 
+        {/* Ollama only: baseURL configuration */}
+        {provider === 'ollama' && (
+          <div className="form-group">
+            <label className="label-with-help">
+              <span>服务地址</span>
+              <button
+                type="button"
+                className="help-icon-btn"
+                onClick={handleOpenOllamaHelp}
+                title="查看 Ollama 使用指南"
+              >
+                <HelpCircle size={16} />
+              </button>
+            </label>
+            <input
+              type="text"
+              placeholder="http://localhost:11434/v1"
+              value={baseURL}
+              onChange={(e) => setBaseURL(e.target.value)}
+              className="api-key-input"
+            />
+            <div className="form-hint">
+              Ollama 默认运行在 http://localhost:11434,如果修改了端口或使用远程服务,请在此配置
+            </div>
+          </div>
+        )}
+
+        {/* Custom only: baseURL configuration */}
+        {provider === 'custom' && (
+          <div className="form-group">
+            <label className="label-with-help">
+              <span>服务地址 *</span>
+              <button
+                type="button"
+                className="help-icon-btn"
+                onClick={handleOpenCustomHelp}
+                title="查看自定义服务使用指南"
+              >
+                <HelpCircle size={16} />
+              </button>
+            </label>
+            <input
+              type="text"
+              placeholder="https://api.example.com/v1"
+              value={baseURL}
+              onChange={(e) => setBaseURL(e.target.value)}
+              className="api-key-input"
+              required
+            />
+            <div className="form-hint">
+              请输入 OpenAI 兼容的 API 地址(需包含 /v1),例如:OneAPI、API2D、自建中转等
+            </div>
+          </div>
+        )}
+
         <div className="form-row">
           <div className="form-group">
             <label>选择模型 (支持手动输入)</label>
@@ -397,6 +545,42 @@ function AISummarySettings({
         <div className="info-box-simple">
           <p>💡 提示:API 密钥存储在本地,不会上传到任何服务器。摘要内容仅用于本地展示。</p>
         </div>
+
+        {/* Ollama usage guide modal */}
+        {showOllamaHelp && (
+          <div className="ollama-help-modal" onClick={() => setShowOllamaHelp(false)}>
+            <div className="ollama-help-content" onClick={(e) => e.stopPropagation()}>
+              <div className="ollama-help-header">
+                <h2>Ollama 本地 AI 使用指南</h2>
+                <button className="close-btn" onClick={() => setShowOllamaHelp(false)}>
+                  <X size={20} />
+                </button>
+              </div>
+              <div
+                className="ollama-help-body markdown-content"
+                dangerouslySetInnerHTML={{ __html: ollamaGuideContent || '<p>加载中...</p>' }}
+              />
+            </div>
+          </div>
+        )}
+
+        {/* Custom-service usage guide modal */}
+        {showCustomHelp && (
+          <div className="ollama-help-modal" onClick={() => setShowCustomHelp(false)}>
+            <div className="ollama-help-content" onClick={(e) => e.stopPropagation()}>
+              <div className="ollama-help-header">
+                <h2>自定义 AI 服务使用指南</h2>
+                <button className="close-btn" onClick={() => setShowCustomHelp(false)}>
+                  <X size={20} />
+                </button>
+              </div>
+              <div
+                className="ollama-help-body markdown-content"
+                dangerouslySetInnerHTML={{ __html: customGuideContent || '<p>加载中...</p>' }}
+              />
+            </div>
+          </div>
+        )}
       </div>
     )
   }
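The new `disabled` expression on the test-connection button encodes the per-provider validation rules: Ollama needs no API key, and a custom provider additionally requires a service address. As a sketch, the same rule can be extracted into a standalone predicate (`isTestDisabled` is a hypothetical helper, not part of this commit):

```typescript
// Hypothetical extraction of the test-button rule from the diff above;
// the component inlines this expression directly in the `disabled` prop.
function isTestDisabled(
  isTesting: boolean,
  provider: string,
  apiKey: string,
  baseURL: string
): boolean {
  return (
    isTesting ||
    (provider !== 'ollama' && !apiKey) ||
    (provider === 'custom' && !baseURL)
  )
}

console.log(isTestDisabled(false, 'ollama', '', ''))     // false: Ollama needs no key
console.log(isTestDisabled(false, 'custom', 'sk-x', '')) // true: custom also needs a baseURL
console.log(isTestDisabled(false, 'openai', '', ''))     // true: cloud providers need a key
```

Keeping the rule in one pure function would make it unit-testable, though the inline form matches the component's existing style.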
@@ -231,14 +231,14 @@ export async function setAiProvider(provider: string): Promise<void> {
 }
 
 // Get the config of a given provider
-export async function getAiProviderConfig(providerId: string): Promise<{ apiKey: string; model: string } | null> {
+export async function getAiProviderConfig(providerId: string): Promise<{ apiKey: string; model: string; baseURL?: string } | null> {
   const configs = await config.get('aiProviderConfigs')
   const allConfigs = (configs as any) || {}
   return allConfigs[providerId] || null
 }
 
 // Set the config of a given provider
-export async function setAiProviderConfig(providerId: string, providerConfig: { apiKey: string; model: string }): Promise<void> {
+export async function setAiProviderConfig(providerId: string, providerConfig: { apiKey: string; model: string; baseURL?: string }): Promise<void> {
   const configs = await config.get('aiProviderConfigs')
   const allConfigs = (configs as any) || {}
   allConfigs[providerId] = providerConfig
@@ -246,7 +246,7 @@ export async function setAiProviderConfig(providerId: string, providerConfig: {
 }
 
 // Get the configs of all providers
-export async function getAllAiProviderConfigs(): Promise<{ [providerId: string]: { apiKey: string; model: string } }> {
+export async function getAllAiProviderConfigs(): Promise<{ [providerId: string]: { apiKey: string; model: string; baseURL?: string } }> {
   const value = await config.get('aiProviderConfigs')
   return (value as any) || {}
 }
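These config functions only widen the stored value type with an optional `baseURL`, so existing cloud-provider entries keep working unchanged. A minimal in-memory sketch of the per-provider map they read and write (provider IDs and model names below are illustrative; the real code persists the map through `config.get`/`config.set`):

```typescript
// In-memory stand-in for the persisted `aiProviderConfigs` map.
type ProviderConfig = { apiKey: string; model: string; baseURL?: string }

const store: { [providerId: string]: ProviderConfig } = {}

function setProviderConfig(providerId: string, cfg: ProviderConfig): void {
  store[providerId] = cfg
}

function getProviderConfig(providerId: string): ProviderConfig | null {
  return store[providerId] ?? null
}

// baseURL is optional, so cloud providers can keep storing { apiKey, model } only
setProviderConfig('ollama', { apiKey: '', model: 'llama3', baseURL: 'http://localhost:11434/v1' })
setProviderConfig('deepseek', { apiKey: 'sk-xxx', model: 'deepseek-chat' })

console.log(getProviderConfig('ollama')?.baseURL) // http://localhost:11434/v1
console.log(getProviderConfig('missing'))         // null
```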
@@ -553,6 +553,11 @@ export interface ElectronAPI {
     success: boolean
     error?: string
   }>
+  readGuide: (guideName: string) => Promise<{
+    success: boolean
+    content?: string
+    error?: string
+  }>
   generateSummary: (sessionId: string, timeRange: number, options: {
     provider: string
    apiKey: string
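The `readGuide` channel returns the `{ success, content?, error? }` shape declared above, so a renderer-side caller has to narrow on `success` before using `content`. A sketch of that handling (`renderGuide` is a hypothetical helper, not part of this commit; the settings component does the same narrowing inline):

```typescript
// Result shape of the ai:readGuide IPC handler declared in ElectronAPI.
type GuideResult = { success: boolean; content?: string; error?: string }

function renderGuide(result: GuideResult): string {
  if (result.success && result.content) {
    return result.content // sanitized HTML in the real component
  }
  // Same fallback markup the settings component shows on failure
  return '<p>加载指南失败</p>'
}

console.log(renderGuide({ success: true, content: '<h1>ok</h1>' })) // <h1>ok</h1>
console.log(renderGuide({ success: false, error: 'not found' }))    // <p>加载指南失败</p>
```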