# Model Card for Vijayendra/DeepSeek-Qwen2.5-14B-DeepThinker-v2

## How to Use

The snippet below loads the model in float16 across the available devices and runs a few sample reasoning prompts with sampling-based decoding.

```python
# Install dependencies (accelerate is required for device_map="auto";
# bitsandbytes is only needed for the optional quantized loading shown further below)
!pip install bitsandbytes accelerate

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model name on Hugging Face
MODEL_NAME = "Vijayendra/DeepSeek-Qwen2.5-14B-DeepThinker-v2"

# Load model & tokenizer from Hugging Face
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # make sure a pad token is defined for padding

model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    device_map="auto",          # automatically assigns model layers to available GPUs/CPUs
    torch_dtype=torch.float16,  # 16-bit precision for memory efficiency
)
# Note: do not call .to("cuda") afterwards; device_map="auto" already dispatches the model

# Define inference function
def generate_response(model, tokenizer, prompt, max_new_tokens=3200, temperature=0.7):
    # Tokenize the input and move it to the model's device
    inputs = tokenizer(prompt, return_tensors="pt", padding=True, truncation=True).to(model.device)
    # Generate response
    with torch.no_grad():
        generated_tokens = model.generate(
            inputs.input_ids,
            attention_mask=inputs.attention_mask,  # Ensure attention mask is passed
            max_new_tokens=max_new_tokens,
            temperature=temperature,
            do_sample=True,
            top_k=40,
            top_p=0.9,
            eos_token_id=tokenizer.eos_token_id,
            pad_token_id=tokenizer.pad_token_id
        )

    # Decode response
    return tokenizer.decode(generated_tokens[0], skip_special_tokens=True)

# Test questions
questions = [
    "If a time traveler goes back in time and prevents their own birth, how do they exist to prevent their own birth? Given this paradox, is time travel logically consistent with causality? Explain whether such an event is possible under any known physical theory.",
    "What if the Earth had no axial tilt? Describe the long-term effects on climate, ecosystems, and human civilization. Would technological and agricultural progress have evolved differently?",
    "A number sequence follows this pattern: 2, 6, 12, 20, 30, 42, ...What is the 50th term, and what is the general formula for the nth term?",
    "If an AI model were to become self-aware, how would it know it is self-aware? Could an AI ever prove its own consciousness to a human observer? Discuss using examples from philosophy and neuroscience."
]

# Generate and print responses
for i, question in enumerate(questions, 1):
    response = generate_response(model, tokenizer, question)
    print(f"\n🟢 Question {i}: {question}")
    print(f"🔵 Response: {response}")
```

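If the 14B model does not fit on your GPU in float16, it can optionally be loaded with 4-bit quantization through bitsandbytes (installed above). The sketch below uses transformers' `BitsAndBytesConfig`; treat it as an optional alternative to the float16 loading shown above, trading some quality for memory savings.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_NAME = "Vijayendra/DeepSeek-Qwen2.5-14B-DeepThinker-v2"

# 4-bit NF4 quantization via the bitsandbytes backend
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    device_map="auto",
    quantization_config=bnb_config,  # replaces the plain torch_dtype=float16 loading
)
```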

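If this model's tokenizer defines a chat template (as the DeepSeek-distilled Qwen2.5 family it derives from generally does), prompts may behave better when wrapped with `apply_chat_template` instead of being passed as raw text. A minimal sketch, reusing the `model`, `tokenizer`, and `questions` defined above and decoding only the newly generated tokens:

```python
def generate_chat_response(model, tokenizer, question, max_new_tokens=3200, temperature=0.7):
    # Format the question with the tokenizer's chat template (if one is defined)
    messages = [{"role": "user", "content": question}]
    prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        generated = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            temperature=temperature,
            do_sample=True,
            top_k=40,
            top_p=0.9,
        )
    # Strip the prompt tokens so only the model's answer is returned
    return tokenizer.decode(generated[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)

print(generate_chat_response(model, tokenizer, questions[0]))
```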