= Deepseek
ref [https://hub.docker.com/r/ollama/ollama here][[br]]
ref [https://www.kdnuggets.com/using-deepseek-r1-locally here][[br]]
To run DeepSeek locally, we need to install Ollama and then pull one of the DeepSeek-R1 models: deepseek-r1:1.5b, deepseek-r1:7b, or deepseek-r1:8b.
{{{
~]$ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
~]$ docker exec -it ollama ollama run deepseek-r1:8b
>>> /?
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load           Load a session or model
  /save           Save your current session
  /clear          Clear session context
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.

>>> Send a message (/? for help)
>>> what is the value of x in x^2 - 5x + 6 = 0
}}}

== Client test with HTTP POST
We wrote a simple Python 3 script [attachment:simplePrompt.py] that asks the model for the result of 1 + 1.[[br]]
The other example sends the contents of the [attachment:data.csv] file along with the request, [attachment:query_data_csv.py], so the model can analyze the data. All files are attached to this page.[[br]]
{{{
#!python
import requests


# Send a prompt to the Ollama API
def ask_ollama(prompt):
    try:
        # Ollama API endpoint
        url = "http://192.168.19.15:11434/api/generate"

        # Payload for the API request
        payload = {
            "model": "deepseek-r1:8b",  # Replace with the correct model name
            "prompt": prompt,
            "stream": False  # Set to True if you want streaming responses
        }

        # Send the request to the Ollama API
        response = requests.post(url, json=payload)

        # Check if the request was successful
        if response.status_code == 200:
            # Parse the JSON response
            result = response.json()
            return result.get("response", "No response from model")
        else:
            print(f"Error: {response.status_code} - {response.text}")
            return None
    except Exception as e:
        print(f"Error sending request to Ollama: {e}")
        return None


# Main function
if __name__ == "__main__":
    # Define the prompt
    prompt = "What is 1 + 1?"
    # Send the prompt to Ollama
    print(f"Sending prompt: {prompt}")
    response = ask_ollama(prompt)

    if response:
        print("Model Response:")
        print(response)
    else:
        print("Failed to get a response from the model.")
}}}
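The code of [attachment:query_data_csv.py] is not reproduced above. A minimal sketch of the same idea — reading data.csv and embedding the raw CSV text in the prompt so the model can analyze it — might look like the following; the host, model name, prompt wording, and the `build_csv_prompt`/`analyze_csv` helper names are assumptions, not the attachment's actual code.
{{{
#!python
import requests

# Same Ollama endpoint as simplePrompt.py
OLLAMA_URL = "http://192.168.19.15:11434/api/generate"


def build_csv_prompt(csv_text, question):
    # Embed the raw CSV in the prompt so the model can read it directly
    return f"Here is a CSV file:\n\n{csv_text}\n\nQuestion: {question}"


def analyze_csv(path, question):
    # Read the CSV file and send it to the model together with a question
    with open(path) as f:
        csv_text = f.read()
    payload = {
        "model": "deepseek-r1:8b",
        "prompt": build_csv_prompt(csv_text, question),
        "stream": False
    }
    response = requests.post(OLLAMA_URL, json=payload)
    response.raise_for_status()
    return response.json().get("response", "No response from model")
}}}
For example, `analyze_csv("data.csv", "Summarize the columns and any obvious trends.")` prints whatever analysis the model returns.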
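The script above sets `"stream": False`, so the whole answer arrives in one JSON response. With `"stream": True`, the Ollama API instead returns one JSON object per line, each carrying a fragment of the answer in `response` and a `done` flag on the final chunk. A sketch of a streaming client, assuming the same host and model:
{{{
#!python
import json

import requests

OLLAMA_URL = "http://192.168.19.15:11434/api/generate"


def parse_chunk(line):
    # Each streamed line is one JSON object with a "response" fragment
    # and a "done" flag that marks the final chunk
    obj = json.loads(line)
    return obj.get("response", ""), obj.get("done", False)


def ask_ollama_streaming(prompt):
    payload = {"model": "deepseek-r1:8b", "prompt": prompt, "stream": True}
    with requests.post(OLLAMA_URL, json=payload, stream=True) as response:
        response.raise_for_status()
        for line in response.iter_lines():
            if not line:
                continue
            text, done = parse_chunk(line)
            # Print each fragment as it arrives instead of waiting for the end
            print(text, end="", flush=True)
            if done:
                break
    print()
}}}
Streaming is mainly useful for long answers from the larger models, where waiting for the complete response can take a while.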