<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/" version="2.0"><channel><title>Tag: AI - Warren's Blog</title><link>/tags/ai</link><atom:link href="/tags/ai/feed/tags/ai.xml" rel="self" type="application/rss+xml"/><description>IF THERE’S ANY TRUE LOGIC TO THE UNIVERSE… WE’LL END UP ON THAT GAYHUB AGAIN SOMEDAY.</description><generator>Halo v2.22.4</generator><language>en</language><image><url>https://oss-shenzhen-40g.oss-cn-shenzhen.aliyuncs.com/blog-attach/favicon_1619538565825.ico</url><title>Tag: AI - Warren's Blog</title><link>/tags/ai</link></image><lastBuildDate>Sun, 03 May 2026 17:59:12 GMT</lastBuildDate>
<item><title><![CDATA[Agent Engineering: Ralph in DeepAgent & Claude Code]]></title><link>/archives/agent-gong-cheng-ralph-in-deepagent-claude-code</link><description><![CDATA[Background: I originally had a lot of Deep Research content and material from around the web, but after some thought I deleted it and wrote something in plain language instead. This piece comes from a tech-sharing session at my company; the article's artifacts have been trimmed. This post covers the frontier Agent techniques of 2026: Long Horizon / Running, and the Ralph Loop in LangChain]]></description><guid isPermaLink="false">/archives/agent-gong-cheng-ralph-in-deepagent-claude-code</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><category>Default Category</category><pubDate>Wed, 04 Mar 2026 14:41:00 GMT</pubDate></item>
<item><title><![CDATA[Agent Engineering: Langchain & Long Horizon Agent]]></title><link>/archives/agent-gong-cheng-langchain-long-horizon-agent</link><description><![CDATA[This post is a close reading of the Long Horizon Agent talk Langchain shared recently. Long-horizon tasks, simply put, mean letting an Agent take on something big: not a customer-service bot that answers one question and is done, but a "digital employee" that can independently think, plan, and execute an entire complex workflow. For example, having it independently complete a software]]></description><guid isPermaLink="false">/archives/agent-gong-cheng-langchain-long-horizon-agent</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><category>Default Category</category><pubDate>Thu, 05 Feb 2026 15:43:00 GMT</pubDate></item>
<item><title><![CDATA[Agent Engineering: A First Look at OpenClaw's pi-mono Design]]></title><link>/archives/agent-gong-cheng-chu-tan-openclaw-de-pi-mono-she-ji</link><description><![CDATA[https://github.com/badlogic/pi-mono Pi sketches a blueprint quite unlike today's mainstream AI Agents. It does not try to be an all-knowing "jack of all trades"; instead it aspires to be an "apprentice" with boundless potential that grows alongside you. Isn't that one of the AGIs Ilya imagined? Today, let's talk about the lower-]]></description><guid isPermaLink="false">/archives/agent-gong-cheng-chu-tan-openclaw-de-pi-mono-she-ji</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Mon, 02 Feb 2026 15:09:00 GMT</pubDate></item>
<item><title><![CDATA[Agent Engineering: Long-Running Design with Ralph Agent]]></title><link>/archives/agent-gong-cheng-chang-shi-yun-xing-she-ji-zhi-ralph-agent</link><description><![CDATA[The AI scene has gone a little crazy lately: something called Ralph Wiggum suddenly caught fire. How hot? Some call it "the closest thing to AGI yet"; others let it work through the night and woke up the next morning to find several code repositories fully built. Someone even launched a cryptocurrency for it, $RALPH. Does the name ring a bell? Right: it's the kid from The Simpsons]]></description><guid isPermaLink="false">/archives/agent-gong-cheng-chang-shi-yun-xing-she-ji-zhi-ralph-agent</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Wed, 21 Jan 2026 14:37:00 GMT</pubDate></item>
<item><title><![CDATA[RAG Engineering: DeepResearch in GraphRAG? A Reading of the GRAPHSEARCH Design]]></title><link>/archives/rag-gong-cheng-graphrag-zhong-de-deepresearch-jie-du-graphsearch-she-ji</link><description><![CDATA[This work is an engineering design, with essentially nothing algorithmic about it. Think of it as an Agentic retrieval design for GraphRAG's retrieval stage, something like a "Deep Research" inside GraphRAG. Is it worth borrowing? If we can pin down the concrete causes of bad cases in our current retrieval scenarios, we could bring in this engineering approach and see whether it solves]]></description><guid isPermaLink="false">/archives/rag-gong-cheng-graphrag-zhong-de-deepresearch-jie-du-graphsearch-she-ji</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Thu, 15 Jan 2026 11:12:00 GMT</pubDate></item>
<item><title><![CDATA[RAG Engineering: Taking Incremental Data in Stride - A Reading of the ERA-RAG Design]]></title><link>/archives/rag-gong-cheng-cong-rong-ying-dui-zeng-liang-shu-ju---jie-du-era-rag-she-ji</link><description><![CDATA[I've been doing technical research for my team's needs lately, surveying work in the RAG space. Last time we revisited the classic RAPTOR design; this time we look at the design ideas behind ERA-RAG. The paper does not appear to have been published at any conference or journal, but since RAGFlow put it on its 2025 and 2026 roadmaps, it is probably still worth a look]]></description><guid isPermaLink="false">/archives/rag-gong-cheng-cong-rong-ying-dui-zeng-liang-shu-ju---jie-du-era-rag-she-ji</guid><dc:creator>Warren Zhan</dc:creator><category>Default Category</category><pubDate>Thu, 01 Jan 2026 14:35:00 GMT</pubDate></item>
<item><title><![CDATA[RAG Engineering: Savoring a Classic - A Reading of the RAPTOR Design]]></title><link>/archives/rag-gong-cheng-hui-wei-jing-dian---jie-du-raptor-s-j</link><description><![CDATA[Doing technical research these past couple of days, I took the chance to revisit this classic work accepted at ICLR 2024. Traditional RAG is a bit "short-sighted": it usually chops a long document into a pile of unrelated little fragments, then retrieves the few most similar to your question. That works for factual questions like "What is X?", but once a question requires reading the whole text and understanding the full]]></description><guid isPermaLink="false">/archives/rag-gong-cheng-hui-wei-jing-dian---jie-du-raptor-s-j</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Wed, 31 Dec 2025 15:02:00 GMT</pubDate></item>
<item><title><![CDATA[Engineering Notes - Agent Engineering: Implementing and Reflecting on ReAct]]></title><link>/archives/gong-cheng-jing-yan---agent-gong-cheng-zhi-react-shi-xian-yu-si-kao</link><description><![CDATA[Ever since LLMs became the hot thing, new ideas have poured out daily. Some of them never made a top venue yet remain very important and are still talked about today: CoT and ReAct, for example, have become key cornerstones of present-day Agent engineering. The core of ReAct is how to give an LLM hands and feet so it can act; it is one approach within the Tool Use space. Principle]]></description><guid isPermaLink="false">/archives/gong-cheng-jing-yan---agent-gong-cheng-zhi-react-shi-xian-yu-si-kao</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Thu, 20 Feb 2025 15:22:00 GMT</pubDate></item>
<item><title><![CDATA[Industry Musings - Agent Engineering: Three Directions for R&D]]></title><link>/archives/xing-ye-za-tan---agent-gong-cheng-zhi-yan-fa-de-san-ge-fang-xiang</link><description><![CDATA[I have been in the large-model industry for a bit over a year, and would tentatively position myself as an LLM Agent application engineer. As I see it, the LLM is a key to the future of Software 2.0, and in 2025 one could say that "the Agent is the software of the 2.0 era." Preface: 2023 &amp; 2024 brought surprises (GPT-3.5, GPT-4) and breakout hits (Cursor, De]]></description><guid isPermaLink="false">/archives/xing-ye-za-tan---agent-gong-cheng-zhi-yan-fa-de-san-ge-fang-xiang</guid><dc:creator>Warren Zhan</dc:creator><category>Industry</category><category>Work Reflections</category><pubDate>Mon, 10 Feb 2025 12:08:40 GMT</pubDate></item>
<item><title><![CDATA[Engineering Notes - Agent Engineering: A First Look at the MCP Protocol]]></title><link>/archives/gong-cheng-jing-yan---agent-gong-cheng-zhi-chu-tan-mcp-xie-yi</link><description><![CDATA[1. Minimal MCP Example: run a minimal MCP implementation directly to get a feel for what MCP is. Overview sequenceDiagram participant MCP Client participant MCP Server participant Resources MCP]]></description><guid isPermaLink="false">/archives/gong-cheng-jing-yan---agent-gong-cheng-zhi-chu-tan-mcp-xie-yi</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Mon, 27 Jan 2025 13:34:00 GMT</pubDate></item>
<item><title><![CDATA[Engineering Notes - Agent Engineering: The Function Calling Mechanism]]></title><link>/archives/gong-cheng-jing-yan---agent-gong-cheng-zhi-function-calling-ji-zhi</link><description><![CDATA[My understanding in one sentence: "Function Calling" gives the LLM hands and feet! 1 Minimal example Overview: this demo does the following: sequenceDiagram participant User participant Client participant Op]]></description><guid isPermaLink="false">/archives/gong-cheng-jing-yan---agent-gong-cheng-zhi-function-calling-ji-zhi</guid><dc:creator>Warren Zhan</dc:creator><category>Snippets</category><pubDate>Fri, 24 Jan 2025 14:39:00 GMT</pubDate></item>
<item><title><![CDATA[Machine Learning - Lifting the Veil on Neural Networks]]></title><link>/archives/ji-qi-xue-xi---jie-kai-shen-jing-wang-luo-de-shen-mi-mian-sha</link><description><![CDATA["What you learn on paper always feels shallow; to truly know a thing you must do it yourself." I recently worked through the first two parts of Andrew Ng's classic machine learning trilogy on Coursera: Supervised Machine Learning: Regression and Classification, and Advanced Learning Algorithms]]></description><guid isPermaLink="false">/archives/ji-qi-xue-xi---jie-kai-shen-jing-wang-luo-de-shen-mi-mian-sha</guid><dc:creator>Warren Zhan</dc:creator><category>Work Reflections</category><pubDate>Sun, 18 Aug 2024 08:57:31 GMT</pubDate></item></channel></rss>