```python
def _add_to_vector_store(self, messages, metadata, filters, infer=True):
    """Inference mode: let the LLM decide ADD/UPDATE/DELETE for each fact."""
    # Step 1: extract candidate facts from the conversation.
    system_prompt, user_prompt = get_fact_retrieval_messages(...)
    is_agent_memory = self._should_use_agent_memory_extraction(messages, metadata)

    response = self.llm.generate_response(
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        response_format={"type": "json_object"},
    )
    new_retrieved_facts = json.loads(response)["facts"]

    # Step 2: embed each new fact and retrieve similar existing memories.
    retrieved_old_memory = []
    new_message_embeddings = {}
    for new_mem in new_retrieved_facts:
        new_message_embeddings[new_mem] = self.embedding_model.embed(new_mem, "add")
        existing_memories = self.vector_store.search(
            query=new_mem,
            vectors=new_message_embeddings[new_mem],
            top_k=5,
            filters=search_filters,  # built from `filters`; construction elided
        )
        for mem in existing_memories:
            retrieved_old_memory.append({"id": mem.id, "text": mem.payload.get("data", "")})

    # Replace real UUIDs with short positional IDs before showing them to the
    # LLM, so a mangled ID in its reply cannot point at an arbitrary memory.
    temp_uuid_mapping = {}
    for idx, item in enumerate(retrieved_old_memory):
        temp_uuid_mapping[str(idx)] = item["id"]
        item["id"] = str(idx)

    # Step 3: ask the LLM to reconcile the new facts with the old memories.
    function_calling_prompt = get_update_memory_messages(
        retrieved_old_memory, new_retrieved_facts, self.config.custom_update_memory_prompt
    )
    response: str = self.llm.generate_response(
        messages=[{"role": "user", "content": function_calling_prompt}],
        response_format={"type": "json_object"},
    )
    new_memories_with_actions = json.loads(response)

    # Step 4: apply each action the LLM decided on.
    for resp in new_memories_with_actions.get("memory", []):
        event_type = resp.get("event")
        if event_type == "ADD":
            memory_id = self._create_memory(
                data=resp["text"],
                existing_embeddings=new_message_embeddings,
                metadata=deepcopy(metadata),
            )
        elif event_type == "UPDATE":
            memory_id = _resolve_mapped_id(temp_uuid_mapping, resp, "UPDATE")
            self._update_memory(
                memory_id=memory_id,
                data=resp["text"],
                existing_embeddings=new_message_embeddings,
                metadata=deepcopy(metadata),
            )
        elif event_type == "DELETE":
            memory_id = _resolve_mapped_id(temp_uuid_mapping, resp, "DELETE")
            self._delete_memory(memory_id=memory_id)
```