fixed temperature argument parsing #1619


Open

wants to merge 104 commits into base: main

Changes from all commits — 104 commits
1fd6ebf
fix: relax starlette version requirement to resolve FastAPI conflict
umayado17 Oct 27, 2024
67cd175
Merge pull request #1506 from umayado17/fix/starlette-dependency-conf…
KillianLucas Oct 28, 2024
5d95824
Add "Policies" group to mint.json with legal/license page.
MikeBirdTech Oct 31, 2024
f27d39d
Added Gemini 1.5 Pro profile to Open Interpreter defaults.
MikeBirdTech Oct 31, 2024
3b56f9e
Merge pull request #1516 from MikeBirdTech/gemini-profile
KillianLucas Oct 31, 2024
6ada942
Merge pull request #1515 from MikeBirdTech/privacy-policy
KillianLucas Oct 31, 2024
a50be90
Added Open Interpreter profile for Llama 3.2:3b served locally by Cor…
MikeBirdTech Nov 1, 2024
7907b56
Update cortex-llama32.py to include Cortex configuration and usage in…
MikeBirdTech Nov 1, 2024
66c895b
Incoming
KillianLucas Nov 3, 2024
459208e
Incoming
KillianLucas Nov 4, 2024
2fc7afa
"Added Open Interpreter profile for Anthropic's Claude 3.5 Haiku model."
MikeBirdTech Nov 4, 2024
026623d
Incoming
KillianLucas Nov 4, 2024
c551078
improve docstring
MikeBirdTech Nov 4, 2024
03bd91f
update model name
MikeBirdTech Nov 4, 2024
085e483
Merge pull request #1523 from MikeBirdTech/haiku-profile
KillianLucas Nov 5, 2024
86b61e3
`--model` support
KillianLucas Nov 5, 2024
cbf6077
File editor for non Anthropic LLMs
KillianLucas Nov 5, 2024
af73992
Refactor
KillianLucas Nov 6, 2024
c4b91b1
Incoming
KillianLucas Nov 7, 2024
abc4b31
Incoming
KillianLucas Nov 7, 2024
a6edab6
`a`
KillianLucas Nov 7, 2024
b36f4e0
GUI
KillianLucas Nov 7, 2024
87cf230
Spec
KillianLucas Nov 8, 2024
4b72690
New arguments
KillianLucas Nov 9, 2024
2121316
Incoming
KillianLucas Nov 9, 2024
bde2f6b
Incoming
KillianLucas Nov 9, 2024
20f2fec
Multi-line input
KillianLucas Nov 9, 2024
37e344c
Tool rendering fixes, unused mintlify docs generator (super cool)
KillianLucas Nov 9, 2024
9f7a06e
Profiles
KillianLucas Nov 9, 2024
8f2e22f
In chat messages
KillianLucas Nov 9, 2024
2c2996b
Error reporting
KillianLucas Nov 10, 2024
4880911
Spinner management
KillianLucas Nov 10, 2024
4d7b2ee
Package management
KillianLucas Nov 11, 2024
0506db5
Async spinner, help text
KillianLucas Nov 11, 2024
46c5591
Non Anthropic support
KillianLucas Nov 11, 2024
8c15fa9
Merge pull request #1520 from MikeBirdTech/llama32-cortex
KillianLucas Nov 11, 2024
e793c81
Better profiles, script improvements
KillianLucas Nov 11, 2024
84a7512
Improved rendering, fixed screenshots
KillianLucas Nov 12, 2024
20bc10f
Working server for non-Anthropic LLMs
KillianLucas Nov 12, 2024
6a7ee0c
Force server to use LiteLLM
KillianLucas Nov 12, 2024
194772a
Max turns
KillianLucas Nov 12, 2024
a5290cd
Whitespace error fix and better piping
KillianLucas Nov 12, 2024
bee3f3c
Better profiles, commands, system message
KillianLucas Nov 14, 2024
55db632
Better profiles, commands, system message, ioctl fixes
KillianLucas Nov 14, 2024
fa26518
Better shell integration
KillianLucas Nov 14, 2024
c0bc124
Restored `wtf`, added shell uninstall
KillianLucas Nov 14, 2024
8514f0d
Fixed ctrl C
KillianLucas Nov 14, 2024
69c6302
Construct system message with try excepts
KillianLucas Nov 15, 2024
11fd63b
Support local models, non tool calling models, restricted commands
KillianLucas Nov 17, 2024
0ea9198
wtf help command
KillianLucas Nov 17, 2024
9d25164
Max tokens
KillianLucas Nov 17, 2024
a9e4178
Improved CLI
KillianLucas Nov 18, 2024
ca90a22
Improved CLI, pass through params to LiteLLM
KillianLucas Nov 18, 2024
ab97a24
Allow for --save
KillianLucas Nov 18, 2024
f6b6d7c
Fix input() control chars
KillianLucas Nov 21, 2024
99701b9
Fixed terminal width errors, restored placeholder
KillianLucas Nov 22, 2024
dbd3838
Fixed terminal width error
KillianLucas Nov 22, 2024
a045361
Added INTERPRETER_SIMPLE_BASH option
KillianLucas Nov 22, 2024
e01a33a
Added INTERPRETER_SIMPLE_BASH option
KillianLucas Nov 22, 2024
7cc404c
Work without screen, default to anthropic
KillianLucas Nov 23, 2024
bdfe9aa
Non anthropic OS mode
KillianLucas Nov 25, 2024
bd6441a
Better Python API, better KeyboardInturrupt
KillianLucas Dec 2, 2024
2751057
Archived Interpreter Classic
KillianLucas Dec 2, 2024
c365866
New README draft
KillianLucas Dec 2, 2024
1510d85
Better tool output for non Anthropic models
KillianLucas Dec 3, 2024
848252c
Potential syntax error fix
KillianLucas Dec 3, 2024
eae8cf2
Potential syntax error fix
KillianLucas Dec 3, 2024
798c35e
New installer
KillianLucas Dec 3, 2024
cdec706
Downgrade httpx, fix tool outputs
KillianLucas Dec 4, 2024
0abaa2b
Tests
KillianLucas Dec 4, 2024
3247cd8
New Installer
KillianLucas Dec 4, 2024
cb84bd6
New workflow
KillianLucas Dec 4, 2024
f2258e2
Update new-installer.sh to use --system flag for pip installation
KillianLucas Dec 4, 2024
c23f56d
Enhance new-installer.sh to install Python via uv and remove --system…
KillianLucas Dec 4, 2024
0f41a14
Added back --system
KillianLucas Dec 4, 2024
539fe98
Venv
KillianLucas Dec 4, 2024
ce8de37
Create a virtual environment in a dedicated directory and update the …
KillianLucas Dec 4, 2024
662edb8
Better ubuntu
KillianLucas Dec 4, 2024
3c12690
No windows yet
KillianLucas Dec 4, 2024
b1a854e
Load interpreter inside installer
KillianLucas Dec 5, 2024
19979c8
Load interpreter inside installer
KillianLucas Dec 5, 2024
6ca8a26
Pillow vers
KillianLucas Dec 5, 2024
9a8e034
Don't override provider
KillianLucas Dec 6, 2024
a85094d
add better request handling to openai-compatible server
benxu3 Dec 7, 2024
0d63af5
Merge branch 'development' into development
benxu3 Dec 7, 2024
93e0a0c
remove debug print statements
benxu3 Dec 9, 2024
909e5af
remove unused current task
benxu3 Dec 9, 2024
928da86
Merge branch 'development' of https://github.com/benxu3/open-interpre…
benxu3 Dec 9, 2024
d437f1d
Merge pull request #1549 from benxu3/development
KillianLucas Dec 10, 2024
19b9a9d
Improve tool output handling
KillianLucas Dec 10, 2024
969bc11
Temporarily always use simple bash
KillianLucas Dec 10, 2024
fa0a1ea
Initialize colorama for Windows compatibility in MarkdownRenderer and…
Notnaton Jan 18, 2025
ce106f9
Add colorama dependency for enhanced Windows compatibility
Notnaton Jan 18, 2025
f38dff6
Merge pull request #1580 from OpenInterpreter/Windows-ui-fix
KillianLucas Jan 24, 2025
568502e
Remove unreachable server initialization
endolith Mar 24, 2025
737d603
Print helpful OpenAI-compatible server info
endolith Mar 24, 2025
f126d5e
Define version number in one place
endolith Mar 24, 2025
b118d96
Don't check equality with None
endolith Mar 25, 2025
018eb2d
Delete OI classic archive
endolith Mar 30, 2025
8cd1eb9
Merge pull request #1614 from endolith/remove_archive
Notnaton Mar 30, 2025
7fe61da
Merge pull request #1609 from endolith/none_equality
Notnaton Mar 30, 2025
da0662b
Merge pull request #1606 from endolith/server
Notnaton Mar 30, 2025
fa06bfd
Merge pull request #1608 from endolith/version_number
Notnaton Mar 30, 2025
6ee9091
fixed temperature argument parsing
Apr 16, 2025
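The PR title suggests the `--temperature` value was previously mishandled during argument parsing (for example, kept as a raw string). A minimal sketch of float-typed parsing with argparse — the flag name, default, and help text here are illustrative assumptions, not the project's actual CLI definition:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Build a CLI parser whose --temperature flag is parsed as a float."""
    parser = argparse.ArgumentParser(prog="interpreter")
    parser.add_argument(
        "--temperature",
        type=float,    # parse "0.7" -> 0.7 instead of passing the raw string along
        default=None,  # None means "let the LLM provider use its own default"
        help="Sampling temperature forwarded to the LLM (typically 0.0-2.0)",
    )
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args(["--temperature", "0.7"])
    print(args.temperature)  # a float, safe to use in arithmetic and API payloads
```

With `type=float`, argparse also rejects non-numeric input (`--temperature warm` exits with a usage error) instead of silently forwarding a bad value to the provider.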
32 changes: 16 additions & 16 deletions .github/workflows/python-package.yml
@@ -5,31 +5,31 @@ on:
     branches: ["main", "development"]
   pull_request:
     branches: ["main", "development"]
+  workflow_dispatch:
 
 jobs:
-  build:
-    runs-on: ubuntu-latest
+  test:
+    name: Test on ${{ matrix.os }}
+    runs-on: ${{ matrix.os }}
     strategy:
       fail-fast: true
       matrix:
-        python-version: ["3.10", "3.12"]
+        os: [ubuntu-latest, macos-latest] # figure out windows-latest later
 
     steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
+      - uses: actions/checkout@v4
         with:
-          python-version: ${{ matrix.python-version }}
-      - name: Install poetry
-        run: |
-          curl -sSL https://install.python-poetry.org | python3 -
-      - name: Install dependencies
+          fetch-depth: 0
+
+      - name: Install Open Interpreter
+        shell: bash
         run: |
-          # Ensure dependencies are installed without relying on a lock file.
-          poetry update
-          poetry install -E server
-      - name: Test with pytest
+          curl https://raw.githubusercontent.com/OpenInterpreter/open-interpreter/refs/heads/development/installers/new-installer.sh | sh
+
+      - name: Run tests
+        shell: bash
         run: |
-          poetry run pytest -s -x -k test_
+          interpreter run tests/ -v --color=yes
         env:
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
+          PYTHONUNBUFFERED: "1"
4 changes: 2 additions & 2 deletions .gitignore
@@ -230,8 +230,8 @@ nix/
 # Ignore the replit.nix configuration file
 replit.nix
 
-# Ignore misc directory
-misc/
+# Ignore top level misc directory
+/misc/
 
 # Ignore litellm_uuid.txt
 litellm_uuid.txt
12 changes: 3 additions & 9 deletions Dockerfile
@@ -1,14 +1,9 @@
-###########################################################################################
-# This Dockerfile runs an LMC-compatible websocket server at / on port 8000. #
-# To learn more about LMC, visit https://docs.openinterpreter.com/protocols/lmc-messages. #
-###########################################################################################
-
 FROM python:3.11.8
 
 # Set environment variables
 # ENV OPENAI_API_KEY ...
 
-ENV HOST 0.0.0.0
+ENV INTERPRETER_HOST 0.0.0.0
 # ^ Sets the server host to 0.0.0.0, Required for the server to be accessible outside the container
 
 # Copy required files into container
@@ -20,8 +15,7 @@ COPY poetry.lock pyproject.toml README.md ./
 # Expose port 8000
 EXPOSE 8000
 
-# Install server dependencies
-RUN pip install ".[server]"
+RUN pip install "."
 
 # Start the server
-ENTRYPOINT ["interpreter", "--server"]
+ENTRYPOINT ["interpreter", "--serve"]