mirror of https://gitee.com/houhuan/TrendRadar.git
synced 2025-12-21 14:17:16 +08:00

chore: polish docs; add the local news test data required for AI analysis

This commit is contained in:
parent 9cf3fccacd
commit 5877cb623e
325 README-EN.md
@@ -47,7 +47,7 @@

| [🎯 Core Features](#-core-features) | [🚀 Quick Start](#-quick-start) | [🐳 Docker Deployment](#-docker-deployment) | [🤖 AI Analysis](#-ai-analysis-deployment) |
|:---:|:---:|:---:|:---:|
| [📝 Changelog](#-changelog) | [🔌 MCP Clients](#-mcp-clients) | [❓ FAQ & Support](#-faq--support) | [⭐ Related Projects](#-related-projects) |
| [🔧 Custom Platforms](#custom-monitoring-platforms) | [📝 Keywords Config](#frequencywordstxt-configuration) | | |
| [🔧 Custom Platforms](#custom-monitoring-platforms) | [📝 Keywords Config](#frequencywordstxt-configuration) | [🪄 Sponsors](#-sponsors) | |

</div>

@@ -102,8 +102,6 @@ This project uses the API from [newsnow](https://github.com/ourongxing/newsnow)

</details>

> This project uses the API from [newsnow](https://github.com/ourongxing/newsnow) to fetch multi-platform data

## ✨ Core Features

### **Multi-Platform Trending News Aggregation**

@@ -474,7 +472,11 @@ AI conversational analysis system based on MCP (Model Context Protocol), enablin

- Cross-platform data comparison (activity stats, keyword co-occurrence)
- Smart summary generation, similar news finding, historical correlation search

> No more manual data file browsing—AI assistant helps you understand the stories behind the news in seconds
> **💡 Usage Tip**: AI features require local news data support
> - Project includes **November 1-15** test data for immediate experience
> - Recommend deploying the project yourself to get more real-time data
>
> See [AI Analysis Deployment](#-ai-analysis-deployment) for details

### **Zero Technical Barrier Deployment**

@@ -904,7 +906,41 @@ frequency_words.txt file added **required word** feature, using + sign

<br>

**Method 2:** (See Chinese version for detailed steps)
**Method 2:**

1. Open https://botbuilder.feishu.cn/home/my-app in a PC browser

2. Click "New Bot Application"

3. After entering the created application, click "Process Design" > "Create Process" > "Select Trigger"

4. Scroll down and click "Webhook Trigger"

5. You will now see a "Webhook Address"; copy this link into a local notepad for now, then continue with the next steps

6. In "Parameters", paste the following content, then click "Done"

```json
{
  "message_type": "text",
  "content": {
    "total_titles": "{{Content}}",
    "timestamp": "{{Content}}",
    "report_type": "{{Content}}",
    "text": "{{Content}}"
  }
}
```

7. Click "Select Action" > "Send Feishu Message", check "Group Message", then click the input box below and select "Groups I Manage" (if you have no group yet, create one in the Feishu app)

8. Fill in "TrendRadar Trending Monitor" as the message title

9. The most critical part: click the + button, select "Webhook Trigger", then arrange the fields as shown in the image

![fs](_image/fs.png)

10. After configuration, put Webhook address from step 5 into GitHub Secrets `FEISHU_WEBHOOK_URL`
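
If you want to verify the flow before relying on the automated push, you can send a manual test request to the Webhook address from step 5. The snippet below is only a hedged sketch: the placeholder URL and the sample field values are illustrative, but the field names match the parameters defined in step 6.

```bash
# Hypothetical smoke test for the Feishu flow webhook.
# Replace <webhook-address-from-step-5> with the address copied in step 5;
# the sample values below are placeholders, not real report data.
curl -X POST "<webhook-address-from-step-5>" \
  -H "Content-Type: application/json" \
  -d '{
        "message_type": "text",
        "content": {
          "total_titles": "3",
          "timestamp": "2025-11-15 09:00",
          "report_type": "daily",
          "text": "TrendRadar test message"
        }
      }'
```

If the flow is configured correctly, the test message should show up in the group selected in step 7.
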
</details>

@@ -1399,7 +1435,31 @@ docker exec -it trend-radar ls -la /app/config/

## 🤖 AI Analysis Deployment

TrendRadar v3.0.0 added an AI analysis feature based on **MCP (Model Context Protocol)**, allowing natural-language conversations with news data for deep analysis. The best prerequisite for using the **AI features** is to have run this project for at least one day (to accumulate news data).
TrendRadar v3.0.0 added an AI analysis feature based on **MCP (Model Context Protocol)**, allowing natural-language conversations with news data for deep analysis.

### ⚠️ Important Notice Before Use

**Critical: AI features require local news data support**

AI analysis **does not** query real-time online data directly; it analyzes **locally accumulated news data** (stored in the `output` folder).

#### Usage Instructions:

1. **Built-in Test Data**: The `output` directory includes news data from **November 1-15, 2025** by default for quick feature testing

2. **Query Limitations**:
- ✅ Only query data within the available date range (Nov 1-15)
- ❌ Cannot query real-time news or future dates

3. **Getting Latest Data** (a quick local check is sketched after this list):
- Test data is for a quick first experience only; **deploying the project yourself** is recommended to get real-time data
- Follow [Quick Start](#-quick-start) to deploy and run the project
- After accumulating news data for at least 1 day, you can query the latest trending topics
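
As referenced above, a quick way to see which dates are available locally is to look at the `output` folder directly. This is a minimal sketch that assumes the data is grouped into per-date subfolders (as the bundled November 1-15 test data suggests); run it from the project root.

```bash
# Show which per-date data folders are available for AI analysis.
ls output/

# Optional: check how much news data has accumulated so far.
du -sh output/
```
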
---

### 1. Quick Deployment

@@ -1528,7 +1588,189 @@ Create `.cursor/mcp.json`:

</details>

(Additional client configs including VSCode/Cline/Continue, Claude Code CLI, MCP Inspector, and others available in Chinese version)

<details>
<summary><b>👉 Click to expand: VSCode (Cline/Continue)</b></summary>

#### Cline Configuration

Add in Cline's MCP settings:

**HTTP Mode**:

```json
{
  "trendradar": {
    "url": "http://localhost:3333/mcp",
    "type": "streamableHttp",
    "autoApprove": [],
    "disabled": false
  }
}
```

**STDIO Mode** (Recommended):

```json
{
  "trendradar": {
    "command": "uv",
    "args": [
      "--directory",
      "/path/to/TrendRadar",
      "run",
      "python",
      "-m",
      "mcp_server.server"
    ],
    "type": "stdio",
    "disabled": false
  }
}
```

#### Continue Configuration

Edit `~/.continue/config.json`:

```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "uv",
          "args": [
            "--directory",
            "/path/to/TrendRadar",
            "run",
            "python",
            "-m",
            "mcp_server.server"
          ]
        }
      }
    ]
  }
}
```

**Usage Examples**:

```
Analyze the "Tesla" popularity trend over the past 7 days
Generate today's trending summary report
Search "Bitcoin" related news and analyze sentiment
```

</details>

<details>
<summary><b>👉 Click to expand: Claude Code CLI</b></summary>

#### HTTP Mode Configuration

```bash
# 1. Start HTTP service
# Windows: start-http.bat
# Mac/Linux: ./start-http.sh

# 2. Add MCP server
claude mcp add --transport http trendradar http://localhost:3333/mcp

# 3. Verify connection (ensure service started)
claude mcp list
```

#### Usage Examples

```bash
# Query news
claude "Search today's Zhihu trending news, top 10"

# Trend analysis
claude "Analyze 'artificial intelligence' topic popularity trend for the past week"

# Data comparison
claude "Compare Zhihu and Weibo platform attention on 'Bitcoin'"
```

</details>

<details>
<summary><b>👉 Click to expand: MCP Inspector</b> (Debug Tool)</summary>
<br>

MCP Inspector is the official debug tool for testing MCP connections:

#### Usage Steps

1. **Start TrendRadar HTTP Service**:

   ```bash
   # Windows
   start-http.bat

   # Mac/Linux
   ./start-http.sh
   ```

2. **Start MCP Inspector**:

   ```bash
   npx @modelcontextprotocol/inspector
   ```

3. **Connect in Browser**:
   - Visit: `http://localhost:3333/mcp`
   - Test "Ping Server" function to verify connection
   - Check "List Tools" returns 13 tools:
     - Basic Query: get_latest_news, get_news_by_date, get_trending_topics
     - Smart Search: search_news, search_related_news_history
     - Advanced Analysis: analyze_topic_trend, analyze_data_insights, analyze_sentiment, find_similar_news, generate_summary_report
     - System Management: get_current_config, get_system_status, trigger_crawl

</details>

<details>
<summary><b>👉 Click to expand: Other MCP-Compatible Clients</b></summary>
<br>

Any client that supports the Model Context Protocol can connect to TrendRadar:

#### HTTP Mode

**Service Address**: `http://localhost:3333/mcp`

**Basic Config Template**:

```json
{
  "name": "trendradar",
  "url": "http://localhost:3333/mcp",
  "type": "http",
  "description": "News Trending Aggregation Analysis"
}
```

#### STDIO Mode (Recommended)

**Basic Config Template**:

```json
{
  "name": "trendradar",
  "command": "uv",
  "args": [
    "--directory",
    "/path/to/TrendRadar",
    "run",
    "python",
    "-m",
    "mcp_server.server"
  ],
  "type": "stdio"
}
```

**Notes**:
- Replace `/path/to/TrendRadar` with the actual project path
- Windows paths need escaped backslashes: `C:\\Users\\...`
- Ensure project dependencies installed (ran setup script)
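
Before wiring any of these clients up, it can help to confirm that the STDIO entry point starts at all. The one-liner below is a small sketch built from the same `command`/`args` values as the template above; stop it with Ctrl+C once it runs without errors.

```bash
# Launch the MCP server with the same command the STDIO template describes.
# Replace /path/to/TrendRadar with your actual checkout path.
uv --directory /path/to/TrendRadar run python -m mcp_server.server
```

If this starts cleanly, the same configuration should work from the client as well.
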
</details>

## ☕ FAQ & Support

@@ -1541,28 +1783,13 @@ Create `.cursor/mcp.json`:

<summary><b>👉 Click to expand: Author's Note</b></summary>
<br>

Thanks for all the support! Due to sponsor support, the **one-yuan donation** QR code has been removed.
Thanks for all the support! Due to 302.AI's sponsorship, my personal **one-yuan donation** QR code has been removed.

Previous **one-yuan supporters** are listed in the **Acknowledgments** section at the top.

This project's development and maintenance require significant time, effort, and costs (including AI model fees). With sponsorship support, I can maintain it with more peace of mind.

Currently, major AI model prices are relatively affordable. If you don't have a suitable model yet, clicking **302.AI** below also supports the developer:

<div align="center">

<span style="margin-left: 10px"><a href="https://share.302.ai/mEOUzG" target="_blank"><img src="_image/icon-302ai.png" alt="302ai logo" width="100"/></a></span>

</div>

**Usage Process:**

1. After registering and topping up, go to the [Management Dashboard](https://302.ai/dashboard/overview) at the top right
2. Click [API Keys](https://302.ai/apis/list) on the left
3. Find the default API KEY at the bottom of the page, click the eye icon to view it, then copy it (note: don't click the copy button on the far right)
4. Cherry Studio has integrated 302.AI; just fill in the API key to use it (currently you must fill in the key first to see the complete model list)

If you already have a suitable model, welcome to **register and try**~
Currently, major AI model prices are relatively affordable. You're welcome to register and try it out; you can **[click here to claim $1 of free credit](#-sponsors)**.

</details>

@@ -1580,16 +1807,58 @@ If you already have a suitable model, welcome to **register and try**~

## 🪄 Sponsors

> 302.AI is a pay-as-you-go enterprise-level AI resource platform
> Providing the latest and most comprehensive **AI models** and **APIs** on the market, plus various ready-to-use online AI applications.
> **302.AI** is a pay-as-you-go enterprise-level AI resource platform
> Providing the latest and most comprehensive **AI models** and **APIs** on the market, plus various ready-to-use online AI applications

<div align="center">

<span style="margin-left: 10px"><a href="https://share.302.ai/mEOUzG" target="_blank"><img src="_image/banner-302ai-en.jpg" alt="302ai banner" width="800"/></a>

<a href="https://share.302.ai/mEOUzG" target="_blank">
<img src="_image/banner-302ai-en.jpg" alt="302.AI" width="800"/>
</a>
</div>

### 💰 302.AI New User Benefits

> The $1 credit can be used to call various AI models (such as Claude, GPT, etc.)
> This project's AI analysis features require AI model integration. See [AI Analysis Deployment](#-ai-analysis-deployment) for the configuration tutorial

[![Claim $1 Free Credit](https://img.shields.io/badge/🎁_Claim_$1_Free_Credit-302.AI-8A2BE2?style=for-the-badge)](https://share.302.ai/mEOUzG)

<details id="sponsor-tutorial">
<summary><b>👉 Click to expand: 302.AI Usage Tutorial</b></summary>

### Step 1: Get API Key

1. After registration, go to the [Management Dashboard](https://302.ai/dashboard/overview) at the top right
2. Click [API Keys](https://302.ai/apis/list) on the left
3. Find the default API KEY at the bottom of the page, **click the eye icon to view it**, then copy it
   (⚠️ Note: don't click the copy button on the far right)

### Step 2: Configure in Cherry Studio

1. Open Cherry Studio and go to settings
2. Select **"302.AI"** as the model provider
3. Paste the API Key you just copied
4. Click **Manage**; you can now use all supported AI models

**Tip:** Cherry Studio has natively integrated 302.AI; you can see the complete model list after configuration.

**Q: How long does the $1 free credit last?**
A: It depends on usage frequency and model selection; it is enough for multiple test sessions.

**Q: What happens after the free credit runs out?**
A: You can top up as needed and pay as you go. Major AI model prices are now relatively affordable.

</details>

<br>

---

### Common Questions

107 readme.md

@@ -46,7 +46,7 @@

| [🎯 Core Features](#-核心功能) | [🚀 Quick Start](#-快速开始) | [🐳 Docker Deployment](#-docker-部署) | [🤖 AI Analysis](#-ai-智能分析部署) |
|:---:|:---:|:---:|:---:|
| [📝 Changelog](#-更新日志) | [🔌 MCP Clients](#-mcp-客户端) | [❓ Q&A & Charity](#问题答疑与公益捐助) | [⭐ Related Projects](#项目相关) |
| [🔧 Custom Monitoring Platforms](#自定义监控平台) | [📝 frequency_words.txt Configuration](#frequencywordstxt-配置教程) | | |
| [🔧 Custom Monitoring Platforms](#自定义监控平台) | [📝 Push Keywords Config](#frequencywordstxt-配置教程) | [🪄 Sponsors](#-赞助商) | |

</div>

@@ -517,7 +517,11 @@ weight:

- Cross-platform data comparison (activity statistics, keyword co-occurrence)
- Smart summary generation, similar news finding, historical correlation search

> No more manually browsing data files; the AI assistant helps you instantly understand the stories behind the news
> **💡 Usage Tip**: AI features require local news data support
> - The project ships with **November 1-15** test data, so you can try it immediately
> - Deploying and running the project yourself is recommended to get more real-time data
>
> See [AI Analysis Deployment](#-ai-智能分析部署) for details

### **Zero Technical Barrier Deployment**

@@ -1472,7 +1476,31 @@ docker exec -it trend-radar ls -la /app/config/

## 🤖 AI Analysis Deployment

TrendRadar v3.0.0 adds an AI analysis feature based on **MCP (Model Context Protocol)**, letting you talk to your news data in natural language for deep analysis. The best prerequisite for using the **AI features** is to have run this project for at least one day (to accumulate news data)
TrendRadar v3.0.0 adds an AI analysis feature based on **MCP (Model Context Protocol)**, letting you talk to your news data in natural language for deep analysis.

### ⚠️ Important Notice Before Use

**Important: AI features require local news data support**

The AI analysis feature does **not** query real-time online data directly; it analyzes the **news data you have accumulated locally** (stored in the `output` folder)

#### Usage Instructions:

1. **Built-in test data**: the `output` directory includes news data from **November 1 to November 15, 2025** by default, which you can use to quickly try the AI features

2. **Query limitations**:
- ✅ Only data within the available date range (November 1-15) can be queried
- ❌ Real-time news and future dates cannot be queried

3. **Getting the latest data**:
- The test data is only for a quick first experience; **deploying the project yourself** is recommended to get real-time data
- Follow [Quick Start](#-快速开始) to deploy and run the project
- After accumulating news data for at least 1 day, you can query the latest trending topics

---

### 1. Quick Deployment

@@ -1790,35 +1818,17 @@ MCP Inspector is the official debug tool for testing MCP connections:

> If you want to support this project, you can search for **Tencent Charity** in WeChat and donate whatever you like to the **student aid programs** there~
>
> While I am still anxious about information overload, they are struggling in an information desert without even a chance to learn, so they need support more than I do.
> While I am still anxious about information overload, they are struggling in an information desert; they need support more than I do.

<details>
<summary><b>👉 Click to expand: Author's Note</b></summary>
<br>

Thanks to everyone for the support! Due to sponsor support, the **one-yuan donation** QR code has been removed.
Thanks to everyone for the support! Due to the sponsor's support, my personal **one-yuan donation** QR code has been removed.

Friends who previously joined the **one-yuan donation** are listed in the **Acknowledgments** at the top.

Developing and maintaining this project takes a lot of time, effort, and money (including AI model fees); with sponsorship support I can maintain it with more peace of mind.

Major AI model prices are now relatively affordable; if you don't have a suitable model yet, clicking **302.AI** below is also a way to support the developer:

<div align="center">

<span style="margin-left: 10px"><a href="https://share.302.ai/mEOUzG" target="_blank"><img src="_image/icon-302ai.png" alt="302.AI logo" width="100"/></a></span>

</div>

**Usage process:**

1. After registering and topping up, go to the [Management Dashboard](https://302.ai/dashboard/overview) at the top right
2. Click [API Keys](https://302.ai/apis/list) on the left
3. Find the default API KEY at the bottom of the page, click the eye icon to view it, then copy it (note: do not click the copy button on the far right)
4. Cherry Studio has integrated 302.AI; just fill in the API key to use it (currently you must fill in the key first to see the complete model list)

If you already have a suitable model, you are also welcome to **register and try it out**~

</details>

- **GitHub Issues**: best for targeted questions. When asking, please provide complete information (screenshots, error logs, system environment, etc.).

@@ -1835,16 +1845,57 @@ MCP Inspector is the official debug tool for testing MCP connections:

## 🪄 Sponsors

> 302.AI is a pay-as-you-go enterprise-level AI resource platform
> It provides the newest and most comprehensive **AI models** and **APIs** on the market, plus a variety of ready-to-use online AI applications.
> **302.AI** is a pay-as-you-go enterprise-level AI resource platform
> It provides the newest and most comprehensive **AI models** and **APIs** on the market, plus a variety of ready-to-use online AI applications

<div align="center">

<span style="margin-left: 10px"><a href="https://share.302.ai/mEOUzG" target="_blank"><img src="_image/banner-302ai-zh.jpg" alt="302ai banner" width="800"/></a>

<a href="https://share.302.ai/mEOUzG" target="_blank">
<img src="_image/banner-302ai-zh.jpg" alt="302.AI" width="800"/>
</a>
</div>

### 💰 302.AI New User Benefits

> The $1 you claim can be used to call various large AI models (such as Claude, GPT, etc.)
> This project's AI analysis features require a large model to be configured; see [AI Analysis Deployment](#-ai-智能分析部署) for the configuration tutorial

[![Claim $1 Free Credit](https://img.shields.io/badge/🎁_点击领取1美元免费额度-302.AI-8A2BE2?style=for-the-badge)](https://share.302.ai/mEOUzG)

<details id="sponsor-tutorial">
<summary><b>👉 Click to expand: 302.AI Usage Tutorial</b></summary>

### Step 1: Get an API Key

1. After registering, go to the [Management Dashboard](https://302.ai/dashboard/overview) at the top right
2. Click [API Keys](https://302.ai/apis/list) on the left
3. Find the default API KEY at the bottom of the page, **click the eye icon to view it**, then copy it
   (⚠️ Note: do not click the copy button on the far right)

### Step 2: Configure in Cherry Studio

1. Open Cherry Studio and go to Settings
2. Select **"302.AI"** as the model provider
3. Paste the API Key you just copied
4. Click **Manage**; you can now use all supported AI models

**Tip:** Cherry Studio has natively integrated 302.AI; after configuration you can see the complete model list.

**Q: How long does the $1 free credit last?**
A: It depends on usage frequency and model selection; it is enough for multiple test sessions.

**Q: What happens after the free credit runs out?**
A: You can top up as needed and pay as you go. Major AI model prices are now relatively affordable.

</details>

<br>

---

### Common Questions