In this tutorial, we present an advanced AI agent built on Nebius' robust ecosystem, in particular the ChatNebius, NebiusEmbeddings, and NebiusRetriever components of the langchain-nebius package. The agent uses the Llama-3.3-70B-Instruct-fast model to generate high-quality responses, augmented with external capabilities such as Wikipedia search, contextual document retrieval, and safe mathematical calculation. By combining structured prompt design with LangChain's modular framework, this tutorial demonstrates how to build a multi-functional, reasoning-capable AI assistant that is both interactive and extensible. Whether for scientific queries, technological insights, or basic numerical tasks, this agent showcases the potential of Nebius as a platform for building sophisticated AI systems.
!pip install -q langchain-nebius langchain-core langchain-community wikipedia
import os
import getpass
from typing import List, Dict, Any
import wikipedia
from datetime import datetime
We start by installing the essential libraries, langchain-nebius, langchain-core, langchain-community, and wikipedia, which are crucial for building a feature-rich AI assistant. We then import the necessary modules, including os, getpass, datetime, and typing utilities; the wikipedia package gives the agent access to external encyclopedic data.
from langchain_core.documents import Document
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough
from langchain_core.tools import tool
from langchain_nebius import ChatNebius, NebiusEmbeddings, NebiusRetriever
if "NEBIUS_API_KEY" not in os.environ:
os.environ("NEBIUS_API_KEY") = getpass.getpass("Enter your Nebius API key: ")
We import the core LangChain and Nebius components to enable document handling, prompt templates, output parsing, and tool integration. This brings in key classes such as ChatNebius for language modeling, NebiusEmbeddings for vector representations, and NebiusRetriever for semantic search. The user's Nebius API key is read securely with getpass to authenticate subsequent API calls.
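Before diving into the agent class, it helps to see what LangChain's pipe (`|`) composition does conceptually. Below is a minimal plain-Python sketch, with no LangChain required and all names as illustrative stand-ins, of the prompt → model → parser flow used later (the real components are ChatPromptTemplate, ChatNebius, and StrOutputParser):

```python
# Conceptual sketch: an LCEL chain like `inputs | prompt | llm | parser`
# behaves like function composition over a dict of inputs.
def make_chain(prompt_template, llm, parser):
    """Compose three stages into a single callable, mimicking LCEL piping."""
    def chain(inputs: dict) -> str:
        prompt = prompt_template.format(**inputs)   # fill the template
        raw = llm(prompt)                           # call the model
        return parser(raw)                          # post-process the output
    return chain

# Stand-in components for illustration only:
template = "Context: {context}\nQuery: {query}"
fake_llm = lambda prompt: f"ANSWER based on -> {prompt}"
strip_parser = lambda text: text.strip()

chain = make_chain(template, fake_llm, strip_parser)
print(chain({"context": "AI basics", "query": "What is ML?"}))
```

The real chain built later in `process_query` follows exactly this shape, with LangChain Runnables replacing the plain functions.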
class AdvancedNebiusAgent:
    """Advanced AI Agent with retrieval, reasoning, and external tool capabilities"""

    def __init__(self):
        self.llm = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct-fast")
        self.embeddings = NebiusEmbeddings()
        self.knowledge_base = self._create_knowledge_base()
        self.retriever = NebiusRetriever(
            embeddings=self.embeddings,
            docs=self.knowledge_base,
            k=3
        )
        self.agent_prompt = ChatPromptTemplate.from_template("""
You are an advanced AI assistant with access to:
1. A knowledge base about technology and science
2. Wikipedia search capabilities
3. Mathematical calculation tools
4. Current date/time information

Context from knowledge base:
{context}

External tool results:
{tool_results}

Current date: {current_date}

User Query: {query}

Instructions:
- Use the knowledge base context when relevant
- If you need additional information, mention what external sources would help
- Be comprehensive but concise
- Show your reasoning process
- If calculations are needed, break them down step by step

Response:
""")
    def _create_knowledge_base(self) -> List[Document]:
        """Create a comprehensive knowledge base"""
        return [
            Document(
                page_content="Artificial Intelligence (AI) is transforming industries through ML, NLP, and computer vision. Key applications include autonomous vehicles, medical diagnosis, and financial trading.",
                metadata={"topic": "AI", "category": "technology"}
            ),
            Document(
                page_content="Quantum computing uses quantum mechanical phenomena like superposition and entanglement to process information. Companies like IBM, Google, and Microsoft are leading quantum research.",
                metadata={"topic": "quantum_computing", "category": "technology"}
            ),
            Document(
                page_content="Climate change is caused by greenhouse gas emissions, primarily CO2 from fossil fuels. Renewable energy sources are crucial for mitigation.",
                metadata={"topic": "climate", "category": "environment"}
            ),
            Document(
                page_content="CRISPR-Cas9 is a revolutionary gene editing technology that allows precise DNA modifications. It has applications in treating genetic diseases and improving crops.",
                metadata={"topic": "biotechnology", "category": "science"}
            ),
            Document(
                page_content="Blockchain technology enables decentralized, secure transactions without intermediaries. Beyond cryptocurrency, it has applications in supply chain, healthcare, and voting systems.",
                metadata={"topic": "blockchain", "category": "technology"}
            ),
            Document(
                page_content="Space exploration has advanced with reusable rockets, Mars rovers, and commercial space travel. SpaceX, Blue Origin, and NASA are pioneering new missions.",
                metadata={"topic": "space", "category": "science"}
            ),
            Document(
                page_content="Renewable energy costs have dropped dramatically. Solar & wind power are now cheaper than fossil fuels in many regions, driving global energy transition.",
                metadata={"topic": "renewable_energy", "category": "environment"}
            ),
            Document(
                page_content="5G networks provide ultra-fast internet speeds and low latency, enabling IoT devices, autonomous vehicles, and augmented reality applications.",
                metadata={"topic": "5G", "category": "technology"}
            )
        ]
    @tool
    def wikipedia_search(query: str) -> str:
        """Search Wikipedia for additional information"""
        try:
            search_results = wikipedia.search(query, results=3)
            if not search_results:
                return f"No Wikipedia results found for '{query}'"
            page = wikipedia.page(search_results[0])
            summary = wikipedia.summary(search_results[0], sentences=3)
            return f"Wikipedia: {page.title}\n{summary}\nURL: {page.url}"
        except Exception as e:
            return f"Wikipedia search error: {str(e)}"
    @tool
    def calculate(expression: str) -> str:
        """Perform mathematical calculations safely"""
        try:
            # Reject anything beyond digits, basic operators, and parentheses
            allowed_chars = set('0123456789+-*/.() ')
            if not all(c in allowed_chars for c in expression):
                return "Error: Only basic mathematical operations allowed"
            result = eval(expression)
            return f"Calculation: {expression} = {result}"
        except Exception as e:
            return f"Calculation error: {str(e)}"
    def _format_docs(self, docs: List[Document]) -> str:
        """Format retrieved documents for context"""
        if not docs:
            return "No relevant documents found in knowledge base."
        formatted = []
        for i, doc in enumerate(docs, 1):
            formatted.append(f"{i}. {doc.page_content}")
        return "\n".join(formatted)

    def _get_current_date(self) -> str:
        """Get current date and time"""
        return datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    def process_query(self, query: str, use_wikipedia: bool = False,
                      calculate_expr: str = None) -> str:
        """Process a user query with optional external tools"""
        relevant_docs = self.retriever.invoke(query)
        context = self._format_docs(relevant_docs)

        tool_results = []
        if use_wikipedia:
            wiki_keywords = self._extract_keywords(query)
            if wiki_keywords:
                wiki_result = self.wikipedia_search.invoke(wiki_keywords)
                tool_results.append(f"Wikipedia Search: {wiki_result}")
        if calculate_expr:
            calc_result = self.calculate.invoke(calculate_expr)
            tool_results.append(f"Calculation: {calc_result}")
        tool_results_str = "\n".join(tool_results) if tool_results else "No external tools used"

        chain = (
            {
                "context": lambda x: context,
                "tool_results": lambda x: tool_results_str,
                "current_date": lambda x: self._get_current_date(),
                "query": RunnablePassthrough()
            }
            | self.agent_prompt
            | self.llm
            | StrOutputParser()
        )
        return chain.invoke(query)
    def _extract_keywords(self, query: str) -> str:
        """Extract key terms for Wikipedia search"""
        important_words = []
        stop_words = {'what', 'how', 'why', 'when', 'where', 'is', 'are', 'the', 'a', 'an'}
        words = query.lower().split()
        for word in words:
            if word not in stop_words and len(word) > 3:
                important_words.append(word)
        return ' '.join(important_words[:3])
    def interactive_session(self):
        """Run an interactive session with the agent"""
        print("🤖 Advanced Nebius AI Agent Ready!")
        print("Features: Knowledge retrieval, Wikipedia search, calculations")
        print("Commands: 'wiki:' for Wikipedia, 'calc:' for math")
        print("Type 'quit' to exit\n")
        while True:
            user_input = input("You: ").strip()
            if user_input.lower() == 'quit':
                print("Goodbye!")
                break
            use_wiki = False
            calc_expr = None
            if user_input.startswith('wiki:'):
                use_wiki = True
                user_input = user_input[5:].strip()
            elif user_input.startswith('calc:'):
                parts = user_input.split(':', 1)
                if len(parts) == 2:
                    calc_expr = parts[1].strip()
                    user_input = f"Calculate {calc_expr}"
            try:
                response = self.process_query(user_input, use_wiki, calc_expr)
                print(f"\n🤖 Agent: {response}\n")
            except Exception as e:
                print(f"Error: {e}\n")
The heart of the implementation is the AdvancedNebiusAgent class, which orchestrates reasoning, retrieval, and tool integration. It initializes a high-performance Nebius-hosted LLM (meta-llama/Llama-3.3-70B-Instruct-fast) and sets up a semantic retriever over embedded documents, forming a mini knowledge base covering topics such as AI, quantum computing, blockchain, and more. A dynamic prompt template guides the agent's responses by injecting the retrieved context, external tool outputs, and the current date. Two built-in tools, wikipedia_search and calculate, extend the agent's reach with external encyclopedic knowledge and safe arithmetic evaluation, respectively. The process_query method ties everything together, dynamically invoking the prompt chain with the context and tool results to generate informative, multi-source answers. An optional interactive session supports real-time conversation with the agent, recognizing special prefixes such as wiki: or calc: to trigger external tool handling.
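The prefix routing just described is simple enough to test in isolation. Here is a self-contained sketch of that logic as a standalone helper (the tutorial itself inlines this inside interactive_session; `parse_command` is an illustrative name, not part of the original code):

```python
# Sketch of the prefix routing used by the interactive session:
# 'wiki:' toggles Wikipedia lookup, 'calc:' extracts a math expression.
def parse_command(user_input: str):
    """Return (query, use_wiki, calc_expr) for a raw input line."""
    use_wiki, calc_expr = False, None
    if user_input.startswith('wiki:'):
        use_wiki = True
        user_input = user_input[5:].strip()
    elif user_input.startswith('calc:'):
        calc_expr = user_input.split(':', 1)[1].strip()
        user_input = f"Calculate {calc_expr}"
    return user_input, use_wiki, calc_expr

print(parse_command("wiki: quantum computing"))  # ('quantum computing', True, None)
print(parse_command("calc: 2 + 3 * 4"))          # ('Calculate 2 + 3 * 4', False, '2 + 3 * 4')
print(parse_command("plain question"))           # ('plain question', False, None)
```

Factoring the routing out like this makes the command grammar easy to extend, for example with a `date:` or `news:` prefix, without touching the REPL loop.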
if __name__ == "__main__":
agent = AdvancedNebiusAgent()
demo_queries = (
"What is artificial intelligence and how is it being used?",
"Tell me about quantum computing companies",
"How does climate change affect renewable energy adoption?"
)
print("=== Nebius AI Agent Demo ===n")
for i, query in enumerate(demo_queries, 1):
print(f"Demo {i}: {query}")
response = agent.process_query(query)
print(f"Response: {response}n")
print("-" * 50)
print("nDemo with Wikipedia:")
response_with_wiki = agent.process_query(
"What are the latest developments in space exploration?",
use_wikipedia=True
)
print(f"Response: {response_with_wiki}n")
print("Demo with calculation:")
response_with_calc = agent.process_query(
"If solar panel efficiency improved by 25%, what would be the new efficiency if current is 20%?",
calculate_expr="20 * 1.25"
)
print(f"Response: {response_with_calc}n")
Finally, we showcase the agent's capabilities through a set of demonstration queries. The script instantiates AdvancedNebiusAgent, then loops over predefined prompts about AI, quantum computing, and climate change to exercise the retrieval functionality. It next runs a Wikipedia-augmented query about space exploration, pulling in real-time external information to supplement the knowledge base. Lastly, it executes a mathematical scenario involving solar panel efficiency to validate the calculation tool. Together, these demos illustrate how Nebius, combined with LangChain and well-structured prompts, enables intelligent, multi-tool query handling in a real assistant.
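The calculate tool above guards eval with a character whitelist, which blocks names and attribute access but still runs eval under the hood. A more defensive variant, shown here as a sketch and not part of the original tutorial, parses the expression with Python's standard ast module and evaluates only arithmetic nodes:

```python
import ast
import operator

# Map AST operator node types to the corresponding arithmetic functions.
_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expression: str) -> float:
    """Evaluate +, -, *, / expressions without ever calling eval()."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -walk(node.operand)  # allow negative literals like -3
        raise ValueError("Disallowed expression element")
    return walk(ast.parse(expression, mode="eval"))

print(safe_eval("20 * 1.25"))   # 25.0 (the solar-panel demo calculation)
print(safe_eval("2 + 3 * 4"))   # 14
```

Anything outside the allowed node set, such as function calls or attribute lookups, raises ValueError instead of executing, so this drop-in replacement would harden the agent's calculation path.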
In conclusion, this Nebius-powered agent illustrates how to effectively combine LLM-driven reasoning with structured retrieval and external tool use to build a capable, tool-aware assistant. By integrating LangChain with the Nebius APIs, the agent accesses a curated knowledge base, retrieves live data from Wikipedia, and handles arithmetic operations with safety checks. The tutorial's modular architecture, with prompt templates, dynamic chaining, and customizable inputs, provides a practical blueprint for developers seeking to build intelligent systems that go beyond static large language model (LLM) responses.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent venture is the launch of an artificial intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable to a broad audience. The platform draws more than 2 million monthly views, illustrating its popularity among readers.
