feat: disable Datadog log sending

This commit is contained in:
claude-code-best 2026-04-03 09:49:59 +08:00
parent e32c159f35
commit 78144b4dba
3 changed files with 36 additions and 4 deletions


@@ -1,5 +1,22 @@
# DEV-LOG
## Datadog log endpoint made configurable (2026-04-03)
Replaced the hardcoded Anthropic-internal Datadog endpoint with environment-variable-driven configuration; disabled by default.
**Changed files:**
| File | Change |
|------|--------|
| `src/services/analytics/datadog.ts` | `DATADOG_LOGS_ENDPOINT` and `DATADOG_CLIENT_TOKEN` changed from hardcoded constants to reads of `process.env.DATADOG_LOGS_ENDPOINT` / `process.env.DATADOG_API_KEY`, defaulting to empty strings; `initializeDatadog()` gains a guard that returns `false` immediately when the endpoint or token is unconfigured |
| `docs/telemetry-remote-config-audit.md` | Updated section 1 to reflect the new environment-variable configuration |
**Effect:** by default no data is sent to any external service; setting the two environment variables points logging at your own Datadog instance. The existing safeguards (`DISABLE_TELEMETRY`, privacy level, sink killswitch) remain in place.
**Usage:** `DATADOG_LOGS_ENDPOINT=https://http-intake.logs.datadoghq.com/api/v2/logs DATADOG_API_KEY=xxx bun run dev`
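The guard described above can be sketched as a small config reader. This is a minimal illustration, not the actual code in `datadog.ts`; `readDatadogConfig` and the `DatadogConfig` shape are hypothetical names:

```typescript
// Sketch of the env-var-driven Datadog config described above.
// Both variables must be set, otherwise logging stays disabled.
interface DatadogConfig {
  endpoint: string
  clientToken: string
}

// Returns null when either variable is missing, mirroring the
// guard added to initializeDatadog().
function readDatadogConfig(
  env: Record<string, string | undefined>,
): DatadogConfig | null {
  const endpoint = env.DATADOG_LOGS_ENDPOINT ?? ''
  const clientToken = env.DATADOG_API_KEY ?? ''
  if (!endpoint || !clientToken) return null
  return { endpoint, clientToken }
}
```

Passing an explicit `env` object (rather than reading `process.env` inside) keeps the guard trivially testable.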
---
## Sentry error-reporting integration (2026-04-03)
Restored the Sentry integration removed during decompilation. It is controlled via the `SENTRY_DSN` environment variable; when unset, every function is a no-op and normal operation is unaffected.

docs/telemetry-remote-config-audit.md

@@ -4,12 +4,13 @@
**File**: `src/services/analytics/datadog.ts`
- **Endpoint** *(before)*: `https://http-intake.logs.us5.datadoghq.com/api/v2/logs`
- **Client token** *(before)*: `pubbbf48e6d78dae54bceaa4acf463299bf`
- **Endpoint** *(now)*: configured via the `DATADOG_LOGS_ENDPOINT` environment variable (empty by default, i.e. disabled)
- **Client token** *(now)*: configured via the `DATADOG_API_KEY` environment variable (empty by default, i.e. disabled)
- **Behavior**: sends logs in batches (15 s flush interval, 100-entry cap); 1P (direct Anthropic API) users only
- **Event allowlist**: the `tengu_*` event family (startup, errors, OAuth, tool calls, etc.; ~35 kinds)
- **Baseline data**: collects model, platform, arch, version, userBucket (users hashed into 30 buckets), and so on
- **Only in**: `NODE_ENV === 'production'`
- **Config example**: `DATADOG_LOGS_ENDPOINT=https://http-intake.logs.datadoghq.com/api/v2/logs DATADOG_API_KEY=xxx bun run dev`
## 2. 1P event logging (BigQuery)

src/services/analytics/datadog.ts

@@ -9,9 +9,17 @@ import { MODEL_COSTS } from '../../utils/modelCost.js'
import { isAnalyticsDisabled } from './config.js'
import { getEventMetadata } from './metadata.js'
/**
 * Datadog endpoint and token are configurable via environment variables.
 * If either is unset, Datadog logging is disabled entirely (no data sent).
 *
 * DATADOG_LOGS_ENDPOINT=https://http-intake.logs.datadoghq.com/api/v2/logs
 * DATADOG_API_KEY=<your-key>
 */
 const DATADOG_LOGS_ENDPOINT =
-  'https://http-intake.logs.us5.datadoghq.com/api/v2/logs'
-const DATADOG_CLIENT_TOKEN = 'pubbbf48e6d78dae54bceaa4acf463299bf'
+  process.env.DATADOG_LOGS_ENDPOINT ?? ''
+const DATADOG_CLIENT_TOKEN =
+  process.env.DATADOG_API_KEY ?? ''
const DEFAULT_FLUSH_INTERVAL_MS = 15000
const MAX_BATCH_SIZE = 100
const NETWORK_TIMEOUT_MS = 5000
@@ -133,6 +141,12 @@ export const initializeDatadog = memoize(async (): Promise<boolean> => {
    return false
  }
  // No custom endpoint configured — skip Datadog entirely
  if (!DATADOG_LOGS_ENDPOINT || !DATADOG_CLIENT_TOKEN) {
    datadogInitialized = false
    return false
  }
  try {
    datadogInitialized = true
    return true
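A send path honoring the `NETWORK_TIMEOUT_MS = 5000` constant above might look like the following sketch. The function name `sendLogs` and the injected `fetchFn` are illustrative assumptions (injection keeps the sketch testable); `DD-API-KEY` is Datadog's documented logs-intake auth header:

```typescript
// Sketch: POST a batch of logs with a 5 s timeout via AbortController.
const NETWORK_TIMEOUT_MS = 5000

type FetchLike = (
  url: string,
  init: {
    method: string
    headers: Record<string, string>
    body: string
    signal: AbortSignal
  },
) => Promise<{ ok: boolean }>

async function sendLogs(
  fetchFn: FetchLike,
  endpoint: string,
  apiKey: string,
  logs: object[],
): Promise<boolean> {
  const controller = new AbortController()
  const timer = setTimeout(() => controller.abort(), NETWORK_TIMEOUT_MS)
  try {
    const res = await fetchFn(endpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', 'DD-API-KEY': apiKey },
      body: JSON.stringify(logs),
      signal: controller.signal,
    })
    return res.ok
  } catch {
    // Timeout or network error: telemetry failures never crash the app.
    return false
  } finally {
    clearTimeout(timer)
  }
}
```

Returning `false` instead of throwing matches the fail-open pattern in `initializeDatadog()`: telemetry problems disable logging rather than break the tool.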