  1. Download the code

    git clone https://github.com/IlyaGusev/holosophos.git
    cd holosophos
    
  2. Create a .env file at the project root. Typical entries (keys below are dummy):

    # LLM provider keys (use what you need)
    OPENROUTER_API_KEY=sk-or-v1-xxxxxxxxxxxxxxxxxxxxxxxx
    
    # Tooling keys
    TAVILY_API_KEY=tvly-dev-xxxxxxxxxxxxxxxxxxxx
    VAST_AI_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    
    # Optional observability
    PHOENIX_URL=http://localhost:6006
    PHOENIX_PROJECT_NAME=holosophos
    
    # MCP endpoints when running locally
    ACADEMIA_MCP_URL=http://0.0.0.0:5056/mcp
    MLE_KIT_MCP_URL=http://0.0.0.0:5057/mcp
    
    # App options
    MODEL_NAME=openai/gpt-5
    PORT=5055
    

    You will need a real `OPENROUTER_API_KEY`, which you can obtain after registering at [https://openrouter.ai](https://openrouter.ai/).


    A `TAVILY_API_KEY` can be obtained for free by registering at [https://www.tavily.com](https://www.tavily.com/).
    

    `VAST_AI_KEY` is optional; set it if you want to use an external GPU server for computations (recommended).

  3. Create a `workdir` folder in the main folder, e.g.:
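
    mkdir workdir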

  4. Run this inside the main folder

    docker compose up --build
    

    If you run this locally on your machine, the application will be available at http://localhost:5055 (unless a different PORT is specified).
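
    To quickly check from a terminal that the app is serving (assuming curl is available; this command is just a convenience, not part of the project):

      curl -s -o /dev/null -w "%{http_code}\n" http://localhost:5055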

  5. Run this in the main folder

    uv run python -m codearkt.terminal
    
  6. Write your research prompt here, then press Esc, then Enter. For example:

    Draft a 4-page research paper on the connection between ODE splitting schemes and optimization algorithms with:
    - Abstract, Introduction, Methods, Results, Discussion, Conclusion
    - ~1200–1600 words (≈4 pages), clear sectioning
    - 2 figures (placeholders OK) and 1 table
    - ~10 references (APA/ACM style)
    Use LaTeX and compile to PDF. Save all sources and the PDF under /workdir/paper.
    
    A short description of the idea:
    
    Different ODE splitting schemes induce distinct optimization dynamics; deriving optimizers from higher-order and asymmetric splits can give us new optimization algorithms.
    
    This idea builds on the following paper: https://arxiv.org/pdf/2004.08981
    
    
    The authors present a different view on stochastic optimization, which goes back to
    splitting schemes for approximate solutions of ODEs. In this work, the authors draw a
    connection between the stochastic gradient descent approach and a first-order splitting
    scheme for ODEs. The authors consider a special case of splitting, inspired by
    machine learning applications, and derive a new upper bound on the global splitting
    error for it. The authors show that the Kaczmarz method is the limit case of the splitting
    scheme for unit-batch SGD on the linear least squares problem. The authors support
    their findings with systematic empirical studies, which demonstrate that a more
    accurate solution of the local problems leads to stepsize robustness and provides
    better convergence in time and iterations on the softmax regression problem.
    
    Let us study other splitting schemes and derive different optimization algorithms based on them.
    
    Carefully track experiment time and perform only those experiments you can actually complete, starting from very small scales.
    
    ## Experimental rules:
    Ensure that measurement starts before running the different methods, so that they all start from the same initialization point.
    Fix seeds; consider small linear regression, logistic regression, and softmax regression problems; ideally focus on a real-world dataset rather than a synthetic one.
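
As background for the example prompt above, here is a minimal, self-contained Python sketch (illustrative only, not Holosophos code) of the connection it describes: for least squares, first-order (Lie) splitting integrates the per-sample gradient sub-flows one after another; an explicit Euler sub-step recovers unit-batch SGD, while solving each sub-flow exactly and letting the step size grow recovers the Kaczmarz method.

    import numpy as np

    # f(x) = (1/2) * sum_i (a_i @ x - b_i)^2, split into per-sample terms f_i.
    # First-order (Lie) splitting integrates the sub-flows dx/dt = -grad f_i(x)
    # sequentially over a step of length h.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(8, 3))
    b = rng.normal(size=8)

    def euler_substeps(x, h):
        # One explicit Euler step per sub-flow == one epoch of unit-batch SGD.
        for a_i, b_i in zip(A, b):
            x = x - h * (a_i @ x - b_i) * a_i
        return x

    def exact_substeps(x, h):
        # Each sub-flow is linear and can be solved exactly: the residual
        # r = a_i @ x - b_i decays as r(h) = r(0) * exp(-h * ||a_i||^2).
        for a_i, b_i in zip(A, b):
            n2 = a_i @ a_i
            x = x - (1.0 - np.exp(-h * n2)) / n2 * (a_i @ x - b_i) * a_i
        return x

    def kaczmarz_sweep(x):
        # One Kaczmarz sweep: project onto each hyperplane a_i @ x = b_i.
        for a_i, b_i in zip(A, b):
            x = x - (a_i @ x - b_i) / (a_i @ a_i) * a_i
        return x

    x0 = np.zeros(3)
    # As h -> infinity, the exact sub-flows coincide with the Kaczmarz sweep:
    print(np.allclose(exact_substeps(x0, 1e6), kaczmarz_sweep(x0)))  # True
    # For small h, the exact and Euler (SGD) updates agree to O(h^2) per sub-step:
    print(np.max(np.abs(exact_substeps(x0, 1e-3) - euler_substeps(x0, 1e-3))))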
    

Settings are in holosophos/settings.py. You can adjust MODEL_NAME, MAX_COMPLETION_TOKENS, max_iterations for each agent, and so on.
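
For illustration, such a tweak might look like this (a hypothetical sketch; the real layout of settings.py may differ, and these values are not verified project defaults):

    # holosophos/settings.py (hypothetical sketch)
    MODEL_NAME = "openai/gpt-5"       # model identifier, as in the .env example above
    MAX_COMPLETION_TOKENS = 8192      # illustrative value, not a project default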

Instructions are located at: holosophos/prompts/*.yaml

How to use your own LaTeX template? Put your .sty file and a base .tex skeleton into workdir/paper/. By default, the agents4science LaTeX template is used, but you can prompt the system to use your custom template instead. See the example below.
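
For example (the file names here are hypothetical placeholders for your own template files):

    cp mystyle.sty workdir/paper/
    cp skeleton.tex workdir/paper/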