wiki:Deepseek

Version 5 (modified by krit, 3 weeks ago)

--

Deepseek

To run DeepSeek locally, we need to install Ollama and then pull one of the DeepSeek-R1 models: deepseek-r1:1.5b, deepseek-r1:7b, or deepseek-r1:8b.

~]$ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
~]$ docker exec -it ollama ollama run deepseek-r1:8b
>>> /?
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load <model>   Load a session or model
  /save <model>   Save your current session
  /clear          Clear session context
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.

>>> Send a message (/? for help)
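Once the container is running, it can help to confirm that the API is reachable from another machine before writing a full client. A minimal sketch, assuming the same host and port used below (the IP address is an example and should match your own server); Ollama lists its pulled models at GET /api/tags:

```python
import requests

def list_models(base_url="http://192.168.19.15:11434"):
    """Return the names of the models the Ollama server has pulled.

    Ollama exposes this via GET /api/tags. The base_url here is an
    example; point it at your own host.
    """
    response = requests.get(base_url + "/api/tags", timeout=10)
    response.raise_for_status()
    return [m["name"] for m in response.json().get("models", [])]
```

If deepseek-r1:8b appears in the returned list, the container is up and the model is ready to serve requests.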

Client test with HTTP POST

We wrote a simple Python 3 script, simplePrompt.py, that asks the model for the result of 1 + 1. A second example sends the contents of the data.csv file with the request so the model can analyze the data. All files are attached to this page.

import requests

# Send a prompt to the Ollama API
def ask_ollama(prompt):
    try:
        # Ollama API endpoint
        url = "http://192.168.19.15:11434/api/generate"
        
        # Payload for the API request
        payload = {
            "model": "deepseek-r1:8b",  # Replace with the correct model name
            "prompt": prompt,
            "stream": False  # Set to True if you want streaming responses
        }
        
        # Send the request to the Ollama API (large models can be slow,
        # so allow a generous timeout)
        response = requests.post(url, json=payload, timeout=300)
        
        # Check if the request was successful
        if response.status_code == 200:
            # Parse the JSON response
            result = response.json()
            return result.get("response", "No response from model")
        else:
            print(f"Error: {response.status_code} - {response.text}")
            return None
    except Exception as e:
        print(f"Error sending request to Ollama: {e}")
        return None

# Main function
if __name__ == "__main__":
    # Define the prompt
    prompt = "What is 1 + 1?"
    
    # Send the prompt to Ollama
    print(f"Sending prompt: {prompt}")
    response = ask_ollama(prompt)
    
    if response:
        print("Model Response:")
        print(response)
    else:
        print("Failed to get a response from the model.")
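The data.csv example mentioned above can be sketched along the same lines: since /api/generate takes a plain-text prompt, we read the file and embed its contents in the prompt. The file name, endpoint IP, and analysis question here are placeholders; adapt them to your own setup.

```python
import requests

# Same example endpoint as in simplePrompt.py; adjust to your server
OLLAMA_URL = "http://192.168.19.15:11434/api/generate"

def build_csv_prompt(csv_path):
    """Embed the CSV contents in a plain-text analysis prompt."""
    with open(csv_path) as f:
        csv_text = f.read()
    return "Analyze the following CSV data and summarize it:\n\n" + csv_text

def analyze_csv(csv_path, model="deepseek-r1:8b"):
    """Send the CSV-bearing prompt to Ollama and return the model's answer."""
    payload = {
        "model": model,
        "prompt": build_csv_prompt(csv_path),
        "stream": False,
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    return response.json().get("response", "No response from model")
```

Calling analyze_csv("data.csv") sends the whole file in a single prompt; for large files you would want to truncate or pre-summarize the data first, since the model's context window is limited.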

Attachments (3)
