Quickstart
Install memary
1st Option
Make sure you are running a python version supported by memary, then install it from PyPI:
pip install memary
2nd Option
You can also install memary locally:
i. Create a virtual environment with the python version set as specified above
ii. Install python dependencies:
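For example, assuming the repository ships a requirements.txt (check the repo for the exact dependency file), the dependencies can be installed inside the virtual environment with:

pip install -r requirements.txt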
Specify models used
Local OS Support
At the time of writing, memary assumes local models are installed and currently supports all models available through Ollama:
- LLM running locally using Ollama (Llama 3 8B/40B as suggested defaults) OR gpt-3.5-turbo
- Vision model running locally using Ollama (LLaVA as suggested default) OR gpt-4-vision-preview
memary will default to the locally run models unless explicitly specified. Additionally, memary allows developers to easily switch between downloaded models.
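For example, once a ChatAgent is constructed (see More Basic Functionality below), models could be selected explicitly along the lines of this sketch. The llm_model_name and vision_model_name keyword arguments are assumptions for illustration only; check the memary ChatAgent signature for the exact parameter names.

from memary.agent.chat_agent import ChatAgent

# Hypothetical keyword arguments for explicit model selection -- verify the
# exact parameter names against the memary ChatAgent API.
chat_agent = ChatAgent(
    "Personal Agent",
    "data/memory_stream.json",
    "data/entity_knowledge_store.json",
    "data/system_persona.txt",
    "data/user_persona.txt",
    "data/past_chat.json",
    llm_model_name="llama3",      # suggested local Ollama LLM
    vision_model_name="llava",    # suggested local Ollama vision model
)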
Run memary
- (Optional) If running models locally using Ollama, follow the instructions in the Ollama repo.
- Ensure that a .env file exists with any necessary API keys and Neo4j credentials (see the example sketch after this list).
- Update the user persona, which can be found in streamlit_app/data/user_persona.txt, using the user persona template, which can be found in streamlit_app/data/user_persona_template.txt. Instructions have been provided for customization - replace the curly brackets with relevant information.
- (Optional) Update the system persona, if needed, which can be found in streamlit_app/data/system_persona.txt.
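A minimal .env sketch (the variable names below are illustrative rather than authoritative; include whichever keys your chosen models and tools actually require, for example additional keys for the search, locate, or stocks tools):

OPENAI_API_KEY="YOUR_API_KEY"
NEO4J_URL="YOUR_NEO4J_URL"
NEO4J_PW="YOUR_NEO4J_PASSWORD"

Finally, launch the Streamlit app from the streamlit_app directory (assuming its entry point is app.py):

cd streamlit_app
streamlit run app.py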
More Basic Functionality
from memary.agent.chat_agent import ChatAgent
system_persona_txt = "data/system_persona.txt"
user_persona_txt = "data/user_persona.txt"
past_chat_json = "data/past_chat.json"
memory_stream_json = "data/memory_stream.json"
entity_knowledge_store_json = "data/entity_knowledge_store.json"
chat_agent = ChatAgent(
"Personal Agent",
memory_stream_json,
entity_knowledge_store_json,
system_persona_txt,
user_persona_txt,
past_chat_json,
)
Agent Configuration
Pass a subset of the template tools ['search', 'vision', 'locate', 'stocks'] as include_from_defaults to initialize the agent with a different set of default tools, as shown below. Tools can also be imported to configure the agent's capabilities.
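For example, a sketch that reuses the file paths from the snippet above and starts the agent with only the search and locate tools:

chat_agent = ChatAgent(
    "Personal Agent",
    memory_stream_json,
    entity_knowledge_store_json,
    system_persona_txt,
    user_persona_txt,
    past_chat_json,
    include_from_defaults=["search", "locate"],
)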
Adding Custom Tools
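To add a new tool, define a function and register it with the agent, as in the example below: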
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the resulting integer."""
    return a * b

chat_agent.add_tool({"multiply": multiply})
ReAct Custom Tools
More information about creating custom tools for the LlamaIndex ReAct Agent can be found in the LlamaIndex documentation.