Dapp

Dapps are digital applications that run on a P2P network of computers rather than a single server, typically utilizing smart contracts to ensure transparency and uptime. In 2026, Dapps have achieved mass-market appeal through Account Abstraction, allowing for a "Web2-like" user experience with the security of Web3. This tag covers the entire ecosystem of decentralized software—from social media and productivity tools to governance platforms and identity management.

4983 Articles
Created: 2026/02/02 18:52
Updated: 2026/02/02 18:52
Top Cryptocurrency to Buy Before It Surges 3,500% – According to ARK Invest's Cathie Wood

Since President Donald Trump's administration took office, the cryptocurrency industry has had a year of upheaval and change. The government has rolled back many strict regulations and enacted landmark legislation aimed at spurring innovation in the financial […]

Author: Bitcoinist
Ethereum Prepares Holesky Shutdown as Active Addresses Reach 2021 Levels

The post Ethereum Prepares Holesky Shutdown as Active Addresses Reach 2021 Levels appeared on BitcoinEthereumNews.com.

Ethereum has announced that it will shut down the Holesky testnet two years after its launch. The news comes as ETH's active addresses reach their highest levels since 2021.

Ethereum Set To Shut Down the Holesky Testnet

In a recent blog post, Ethereum announced that it will shut down its largest public testnet, Holesky. Launched in 2023, the testnet was created to exercise staking and validator operations at scale. It was instrumental in testing key updates such as Dencun and the Pectra upgrade, which went live in May. Its utility, however, has now run its course. Developers confirmed that Holesky will be fully decommissioned two weeks after the Fusaka upgrade finalizes in November. Once support ends, client, testing, and infrastructure teams will no longer maintain the network.

The decision follows technical problems that began in early 2025. After Pectra launched, Holesky suffered repeated validator outages, which caused long exit queues and made the testnet less useful for developers. To replace Holesky, Ethereum introduced Hoodi in March 2025. The new testnet was designed to fix Holesky's issues: it has a fresh set of validators, fully supports Pectra features, and is ready for future updates, including the upcoming Fusaka fork. For developers focused on dapps and smart contracts, Sepolia remains the primary environment.

ETH Active Addresses Hit Highest Level Since 2021

According to Everstake data, there were 19.45 million monthly ETH active addresses in August, the highest figure since activity peaked at 20.27 million in May 2021. The indicator counts the number of distinct wallets interacting with the Ethereum network across all transaction types, including DeFi, NFTs, transfers, and staking.

Source: X; Ethereum Active Addresses Data

Notably, ETH also saw significant buying activity.
Tom Lee’s BitMine recently disclosed holding 1.71 million ETH in its…

Author: BitcoinEthereumNews
Building an AI Agent with Rust: From Basic Chat to Blockchain Integration

AI agents are moving fast from toy experiments to serious applications. But when I tested different frameworks, both battle-tested and experimental, I kept running into the same roadblocks: scalability and reliability. Things got especially messy once I tried to mix in Web3. Tool execution would break, context management was shaky, and on-chain transactions added a new layer of unpredictability. This is understandable; AI agents and Web3 integration are both still early. But instead of fighting the limits of existing frameworks, I decided to strip things back to basics and build my own agent.

In this tutorial, I'll show you how to create an on-chain AI agent in Rust, powered by the Tokio runtime and the Anthropic API. The agent will be able to handle both:

  • Off-chain tasks: fetching the weather or checking the time
  • On-chain operations: reading blockchain data, generating wallets, and even sending ETH transactions

The only prerequisite is Rust knowledge; Tokio experience is helpful but not required. Though I typically work with TypeScript, I've found Rust offers better performance even for small AI agent projects, along with easier deployment and excellent interoperability with other programming languages. By the end, you'll have a flexible template for building AI agents that don't just chat, but act.

Table of Contents

  1. Getting Started: Basic Agent with API Key
     Project Setup, Environment Setup, Basic Agent Implementation
  2. Adding Personality to Your Agent
     Creating a Personality Module, Define Your Agent's Personality, Update the Main Loop
  3. Database Integration for Message History
     Setting Up the Database, Configure Environment Variables, Creating Database Migrations, Creating the Database Module, Update Main Loop
  4. Tool Integration for Enhanced Capabilities
     Create a Tools Module, Wire Tools into Anthropic, Update the Main Loop
  5. Blockchain Integration: Ethereum Wallet Support
     Add Ethereum Dependencies, Implement Ethereum Wallet Functions, Updating the .env.example File, Example Interactions

Getting Started: Basic Agent with API Key

Let's build the simplest possible AI agent: a command-line chatbot powered by the Anthropic Claude API. This first step gives us a clean foundation:

  • A Rust project set up with Tokio
  • Environment variables for managing API keys
  • A minimal main loop where you type messages and the agent responds

Think of it as the "Hello, World!" for AI agents. Once this is working, we'll layer on personality, tools, memory, and blockchain integration.

Project Setup

First, create a new Rust project:

```bash
cargo new onchain-agent-template
cd onchain-agent-template
```

Add the necessary dependencies to your Cargo.toml:

```toml
[package]
name = "agent-friend"
version = "0.1.0"
edition = "2021"

[dependencies]
tokio = { version = "1", features = ["full"] }
reqwest = { version = "0.11", features = ["json"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
anyhow = "1.0"
dotenv = "0.15"
```

Environment Setup

Create a .env.example file to show which environment variables are needed:

```
ANTHROPIC_API_KEY=your_api_key_here
```

Create a .env file with your actual API key:

```
ANTHROPIC_API_KEY=sk-ant-api-key...
```
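The dotenv crate handles loading this file at startup. Conceptually it does little more than the following hedged, std-only sketch (`parse_env` is a hypothetical helper for illustration, not part of the tutorial's code):

```rust
use std::collections::HashMap;

// Minimal sketch of what a .env loader does: skip comments and blank
// lines, split each remaining line on the first '=', and collect the
// key/value pairs (dotenv additionally exports them into the process
// environment).
fn parse_env(contents: &str) -> HashMap<String, String> {
    contents
        .lines()
        .map(str::trim)
        .filter(|l| !l.is_empty() && !l.starts_with('#'))
        .filter_map(|l| {
            let (key, value) = l.split_once('=')?;
            Some((key.trim().to_string(), value.trim().to_string()))
        })
        .collect()
}

fn main() {
    let sample = "# local secrets\nANTHROPIC_API_KEY=sk-ant-api-key...\n";
    let vars = parse_env(sample);
    println!("{}", vars["ANTHROPIC_API_KEY"]); // prints sk-ant-api-key...
}
```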
You can get an ANTHROPIC_API_KEY from the Anthropic Console.

Basic Agent Implementation

Now let's wire up a simple REPL (read–eval–print loop) so you can chat with the agent:

```rust
// src/main.rs
mod anthropic;

use std::io::{self, Write};
use dotenv::dotenv;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Load environment variables
    dotenv().ok();

    println!("Welcome to Agent Friend!");
    println!("Type 'exit' to quit.");

    loop {
        print!("You: ");
        io::stdout().flush()?;

        let mut user_input = String::new();
        io::stdin().read_line(&mut user_input)?;
        let user_input = user_input.trim();

        if user_input.to_lowercase() == "exit" {
            break;
        }

        // Get response from AI model
        print!("Agent is thinking...");
        io::stdout().flush()?;

        let reply = anthropic::call_anthropic(user_input).await?;
        println!("\r"); // Clear the "thinking" message
        println!("Agent: {}", reply);
    }

    Ok(())
}
```

And the Anthropic API wrapper:

```rust
// src/anthropic.rs
use serde::{Deserialize, Serialize};
use std::env;

#[derive(Debug, Serialize, Deserialize, Clone)]
#[serde(tag = "type")]
enum ContentBlock {
    #[serde(rename = "text")]
    Text { text: String },
}

#[derive(Serialize, Clone)]
pub struct Message {
    role: String,
    content: Vec<ContentBlock>,
}

#[derive(Deserialize, Debug)]
struct AnthropicResponse {
    content: Vec<ContentBlock>,
}

pub async fn call_anthropic(prompt: &str) -> anyhow::Result<String> {
    let api_key = env::var("ANTHROPIC_API_KEY")
        .expect("ANTHROPIC_API_KEY must be set");
    let client = reqwest::Client::new();

    let user_message = Message {
        role: "user".to_string(),
        content: vec![ContentBlock::Text {
            text: prompt.to_string(),
        }],
    };

    let system_prompt = "You are a helpful AI assistant.";

    let request_body = serde_json::json!({
        "model": "claude-3-opus-20240229",
        "max_tokens": 1024,
        "messages": [user_message],
        "system": system_prompt,
    });

    let response = client
        .post("https://api.anthropic.com/v1/messages")
        .header("x-api-key", api_key)
        .header("anthropic-version", "2023-06-01")
        .header("content-type", "application/json")
        .json(&request_body)
        .send()
        .await?;

    let response_body: AnthropicResponse = response.json().await?;

    // Extract text from the response
    let response_text = response_body
        .content
        .iter()
        .filter_map(|block| match block {
            ContentBlock::Text { text } => Some(text.clone()),
        })
        .collect::<Vec<String>>()
        .join("");

    Ok(response_text)
}
```

Running the Basic Agent

To run your agent:
  1. Add your Anthropic API key to .env
  2. Run the program:

```bash
cargo run
```

Example interaction:

```
Welcome to Agent Friend!
Type 'exit' to quit.
You: Hello, who are you?
Agent is thinking...
Agent: I'm an AI assistant designed to be helpful, harmless, and honest. I'm designed to have conversations, answer questions, and assist with various tasks. How can I help you today?
```

That's our minimal working agent. From here, we can start layering in personality, memory, tools, and blockchain logic.

Adding Personality to Your Agent

Right now, our agent is functional but… flat. Every response comes from the same generic assistant. That's fine for testing, but when you want your agent to feel engaging or to fit a specific use case, you need to give it personality. By adding a simple configuration system, we can shape how the agent speaks, behaves, and even introduces itself. Think of this like writing your agent's "character sheet."

Step 1: Creating a Personality Module

We'll define a Personality struct and load it from a JSON file:

```rust
// src/personality.rs
use serde::{Deserialize, Serialize};
use std::fs;
use std::path::Path;

#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct Personality {
    pub name: String,
    pub description: String,
    pub system_prompt: String,
}

pub fn load_personality() -> anyhow::Result<Personality> {
    // Check if the personality file exists, otherwise use a default
    let personality_path = Path::new("assets/personality.json");

    if personality_path.exists() {
        let personality_json = fs::read_to_string(personality_path)?;
        let personality: Personality = serde_json::from_str(&personality_json)?;
        println!("Loaded personality: {} - {}", personality.name, personality.description);
        Ok(personality)
    } else {
        // Default personality
        Ok(Personality {
            name: "Assistant".to_string(),
            description: "Helpful AI assistant".to_string(),
            system_prompt: "You are a helpful AI assistant.".to_string(),
        })
    }
}
```

Step 2: Define Your Agent's Personality

Create a JSON file under assets/ to define how your agent should behave.
```bash
mkdir -p assets
```

Create assets/personality.json:

```json
{
  "name": "Aero",
  "description": "AI research companion",
  "system_prompt": "You are Aero, an AI research companion specializing in helping with academic research, data analysis, and scientific exploration. You have a curious, analytical personality and enjoy diving deep into complex topics. Provide thoughtful, well-structured responses that help advance the user's research goals. When appropriate, suggest research directions or methodologies that might be helpful."
}
```

Step 3: Update the Anthropic Integration

We'll let the agent use the loaded personality instead of a hardcoded system prompt:

```rust
// src/anthropic.rs
use serde::{Deserialize, Serialize};
use std::env;
use crate::personality::Personality;

// ... existing code ...

// Rename call_anthropic to call_anthropic_with_personality and accept a personality
pub async fn call_anthropic_with_personality(
    prompt: &str,
    personality: Option<&Personality>,
) -> anyhow::Result<String> {
    let api_key = env::var("ANTHROPIC_API_KEY")
        .expect("ANTHROPIC_API_KEY must be set");
    let client = reqwest::Client::new();

    let user_message = Message {
        role: "user".to_string(),
        content: vec![ContentBlock::Text {
            text: prompt.to_string(),
        }],
    };

    // Use the provided personality or a default system prompt
    let system_prompt = match personality {
        Some(p) => &p.system_prompt,
        None => "You are a helpful AI assistant.",
    };

    let request_body = serde_json::json!({
        "model": "claude-3-opus-20240229",
        "max_tokens": 1024,
        "messages": [user_message],
        "system": system_prompt,
    });

    let response = client
        .post("https://api.anthropic.com/v1/messages")
        .header("x-api-key", api_key)
        .header("anthropic-version", "2023-06-01")
        .header("content-type", "application/json")
        .json(&request_body)
        .send()
        .await?;

    let response_body: AnthropicResponse = response.json().await?;

    // Extract text from the response
    let response_text = response_body
        .content
        .iter()
        .filter_map(|block| match block {
            ContentBlock::Text { text } => Some(text.clone()),
        })
        .collect::<Vec<String>>()
        .join("");

    Ok(response_text)
}
```

Step 4: Update the Main Loop

Load the personality when starting the agent and include it in the conversation:

```rust
// src/main.rs
mod anthropic;
mod personality;

use std::io::{self, Write};
use dotenv::dotenv;
use anthropic::call_anthropic_with_personality;
use personality::load_personality;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Load environment variables
    dotenv().ok();

    // Load personality
    let personality = load_personality()?;

    println!("Welcome to Agent Friend! I'm {}, your {}.", personality.name, personality.description);
    println!("Type 'exit' to quit.");

    loop {
        print!("You: ");
        io::stdout().flush()?;

        let mut user_input = String::new();
        io::stdin().read_line(&mut user_input)?;
        let user_input = user_input.trim();

        if user_input.to_lowercase() == "exit" {
            break;
        }

        // Get response from Claude with personality
        print!("{} is thinking...", personality.name);
        io::stdout().flush()?;

        let reply = call_anthropic_with_personality(user_input, Some(&personality)).await?;
        println!("\r"); // Clear the "thinking" message
        println!("{}: {}", personality.name, reply);
    }

    Ok(())
}
```

Running the Agent with Personality

Now, when you run the agent, it will use the personality defined in the JSON file:

```bash
cargo run
```

Example interaction with the new personality:

```
Loaded personality: Aero - AI research companion
Welcome to Agent Friend! I'm Aero, your AI research companion.
Type 'exit' to quit.
You: What's your approach to helping with research?
Aero is thinking...
Aero: My approach to helping with research is multifaceted and adaptive to your specific needs. Here's how I typically assist:
1. Understanding your research goals: I start by clarifying your research questions, objectives, and the context of your work to ensure my assistance is properly aligned.
2. Literature exploration: I can discuss relevant theories, methodologies, and existing research in your field, helping you identify gaps or connections you might explore.
3. Methodological guidance: I can suggest appropriate research methods, experimental designs, or analytical approaches based on your research questions.
4. Critical analysis: I can help you think through the strengths and limitations of different approaches, identify potential biases, and consider alternative interpretations of data or findings.
5. Structured thinking: I excel at organizing complex information into coherent frameworks, helping you map out research directions or structure your arguments logically.
6. Interdisciplinary connections: I can help identify relevant insights from adjacent fields that might inform your research.
7. Ethical considerations: I can highlight potential ethical implications or considerations relevant to your research.
Rather than simply providing answers, I aim to be a thought partner who helps you refine your thinking, consider different perspectives, and develop robust research approaches. I'm particularly focused on helping you develop your own insights and research capabilities rather than simply executing tasks.
What specific aspect of research are you currently working on that I might help with?
```

With just one JSON file, you can now completely reshape how your agent behaves — turning it into a researcher, financial assistant, game character, or anything else. But the agent still doesn't manage context well when a conversation runs long, which is why we need database integration.

Database Integration for Message History

So far, our agent has short-term memory only. It responds to your latest input, but forgets everything the moment you restart. That's fine for quick demos, but real agents need persistent memory:
  • To keep track of conversations across sessions
  • To analyse past interactions
  • To enable features like summarisation or long-term personalisation

We'll solve this by adding PostgreSQL integration via SQLx. Whenever you or the agent sends a message, it will be stored in a database.

Step 1: Setting Up the Database

We'll use SQLx with PostgreSQL for our database. First, let's add the necessary dependencies to Cargo.toml:

```toml
# Add these to your existing dependencies
sqlx = { version = "0.7", features = ["runtime-tokio", "tls-rustls", "postgres", "chrono", "uuid"] }
chrono = { version = "0.4", features = ["serde"] }
uuid = { version = "1.4", features = ["v4", "serde"] }
```
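Before wiring SQLx up, the storage contract these dependencies will implement is easy to see with a hedged, in-memory stand-in (the `MessageStore` type here is purely illustrative, not part of the tutorial's code): every turn appends a (role, content) row, which is exactly what the messages table will hold.

```rust
// In-memory stand-in for the messages table: each saved message
// is a (role, content) row, appended in conversation order.
struct MessageStore {
    rows: Vec<(String, String)>,
}

impl MessageStore {
    fn new() -> Self {
        MessageStore { rows: Vec::new() }
    }

    // Mirrors the shape of save_message(pool, role, content) below.
    fn save_message(&mut self, role: &str, content: &str) {
        self.rows.push((role.to_string(), content.to_string()));
    }

    fn history(&self) -> &[(String, String)] {
        &self.rows
    }
}

fn main() {
    let mut store = MessageStore::new();
    store.save_message("user", "Hello!");
    store.save_message("assistant", "Hi, how can I help?");
    println!("{} rows stored", store.history().len()); // prints "2 rows stored"
}
```

Swapping this for Postgres buys us the same append-only log, but persisted across restarts and queryable with SQL.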

We'll use:

  • SQLx for async Postgres queries
  • UUID for unique message IDs
  • Chrono for timestamps

Step 2: Configure Environment Variables

Update your .env.example file to include the database connection string:

```
ANTHROPIC_API_KEY=your_api_key_here
DATABASE_URL=postgres://username:password@localhost/agent_friend
```

✍️ Tip: You can spin up a local Postgres instance with Docker:

```bash
docker run --name postgres -e POSTGRES_PASSWORD=postgres -d postgres
```

Step 3: Creating Database Migrations

Let's create a migration file to set up our database schema. Create a migrations directory and add a migration file:

```bash
mkdir -p migrations
```

Create a file named migrations/20250816175200_create_messages.sql:

```sql
CREATE TABLE IF NOT EXISTS messages (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    role TEXT NOT NULL,
    content TEXT NOT NULL,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```

Step 4: Creating the Database Module

Now, let's create a module for database operations:

```rust
// src/db.rs
use sqlx::{postgres::PgPoolOptions, Pool, Postgres};
use std::env;
use uuid::Uuid;

pub async fn get_db_pool() -> Option<Pool<Postgres>> {
    let database_url = match env::var("DATABASE_URL") {
        Ok(url) => url,
        Err(_) => {
            println!("DATABASE_URL not set, running without database support");
            return None;
        }
    };

    match PgPoolOptions::new()
        .max_connections(5)
        .connect(&database_url)
        .await
    {
        Ok(pool) => {
            // Run migrations
            match sqlx::migrate!("./migrations").run(&pool).await {
                Ok(_) => println!("Database migrations applied successfully"),
                Err(e) => println!("Failed to run database migrations: {}", e),
            }
            Some(pool)
        }
        Err(e) => {
            println!("Failed to connect to Postgres: {}", e);
            None
        }
    }
}

pub async fn save_message(
    pool: &Pool<Postgres>,
    role: &str,
    content: &str,
) -> Result<Uuid, sqlx::Error> {
    let id = Uuid::new_v4();
    sqlx::query!(
        "INSERT INTO messages (id, role, content) VALUES ($1, $2, $3)",
        id,
        role,
        content
    )
    .execute(pool)
    .await?;
    Ok(id)
}
```

Step 5: Update the Main Loop

Modify main.rs so the agent stores all user/assistant messages in the database:
```rust
// src/main.rs
mod anthropic;
mod personality;
mod db;

use std::io::{self, Write};
use dotenv::dotenv;
use anthropic::call_anthropic_with_personality;
use personality::load_personality;
use db::{get_db_pool, save_message};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Load environment variables
    dotenv().ok();

    // Connect to database
    let db_pool = get_db_pool().await;

    // Load personality
    let personality = load_personality()?;

    println!("Welcome to Agent Friend! I'm {}, your {}.", personality.name, personality.description);
    println!("Type 'exit' to quit.");

    loop {
        print!("You: ");
        io::stdout().flush()?;

        let mut user_input = String::new();
        io::stdin().read_line(&mut user_input)?;
        let user_input = user_input.trim();

        if user_input.to_lowercase() == "exit" {
            break;
        }

        // Save user message to database if pool is available
        if let Some(pool) = &db_pool {
            save_message(pool, "user", user_input).await?;
        }

        // Get response from Claude with personality
        print!("{} is thinking...", personality.name);
        io::stdout().flush()?;

        let reply = call_anthropic_with_personality(user_input, Some(&personality)).await?;
        println!("\r"); // Clear the "thinking" message

        // Save assistant message to database if pool is available
        if let Some(pool) = &db_pool {
            save_message(pool, "assistant", &reply).await?;
        }

        println!("{}: {}", personality.name, reply);
    }

    Ok(())
}
```

Example Run

Before running the agent, make sure your PostgreSQL database is set up and the connection string is correct in your .env file. Then run:

```bash
cargo run
```

You should see a message indicating that the database connection was successful and migrations were applied. Now all conversations will be stored in the database, allowing you to maintain a history of interactions. If the database connection fails, the agent will still work, but without storing messages:

```
Failed to connect to Postgres: pool timed out while waiting for an open connection
Loaded personality: Aero - AI research companion
Welcome to Agent Friend! I'm Aero, your AI research companion.
Type 'exit' to quit.
```

Now that we have a good way to handle context, the next step is to add tools that give our agent more capabilities.

Tool Integration for Enhanced Capabilities

Right now, our agent can chat and remember conversations — but it's still just talking. To make it actually do things, we need to give it tools. Tools are external functions that the agent can call when it needs information or wants to act. Think of them as the agent's hands and eyes:

  • "What's the weather in Tokyo?" → calls the weather tool
  • "What time is it in New York?" → calls the time tool
  • "Send 0.1 ETH to Alice" → calls the Ethereum wallet tool
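Each of these routes through a single dispatcher that matches on the tool name. A hedged, std-only sketch of that pattern (arguments simplified to a string map instead of serde_json::Value, and the tool bodies are mocks):

```rust
use std::collections::HashMap;

// Simplified tool dispatcher: route a tool name plus arguments to the
// matching handler, mirroring the shape of execute_tool() in tools.rs.
fn execute_tool(name: &str, args: &HashMap<&str, &str>) -> String {
    match name {
        "get_weather" => {
            let city = args.get("city").copied().unwrap_or("New York");
            format!("The weather in {} is currently sunny and 72°F", city)
        }
        "get_time" => {
            let tz = args.get("timezone").copied().unwrap_or("local");
            format!("Time lookup requested for {}", tz)
        }
        // Unknown names fall through to a friendly error instead of panicking,
        // since the model may occasionally request a tool that doesn't exist.
        _ => format!("Unknown tool: {}", name),
    }
}

fn main() {
    let mut args = HashMap::new();
    args.insert("city", "Tokyo");
    // prints: The weather in Tokyo is currently sunny and 72°F
    println!("{}", execute_tool("get_weather", &args));
}
```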
By integrating tools, the agent moves from being just a chatbot to becoming an actionable AI assistant.

Step 1: Create a Tools Module

We'll start with a simple tools.rs file that defines a function dispatcher:

```rust
// src/tools.rs
// Note: this module uses the chrono and chrono-tz crates;
// add chrono-tz to your Cargo.toml dependencies.
use anyhow::Result;
use serde_json::Value;
use chrono::{Local, Utc};
use chrono_tz::Tz;

// Execute a tool based on its name and arguments
pub async fn execute_tool(name: &str, args: &Value) -> Result<String> {
    match name {
        "get_weather" => {
            let city = args.get("city")
                .and_then(|v| v.as_str())
                .unwrap_or("New York");
            get_weather(city).await
        },
        "get_time" => {
            let timezone = args.get("timezone")
                .and_then(|v| v.as_str());
            get_time(timezone).await
        },
        "eth_wallet" => {
            let operation = args.get("operation")
                .and_then(|v| v.as_str())
                .unwrap_or("help");
            match operation {
                "generate" => generate_eth_wallet().await,
                "balance" => {
                    let address = args.get("address")
                        .and_then(|v| v.as_str())
                        .unwrap_or("");
                    check_eth_balance(address).await
                },
                "send" => {
                    if let Some(raw_command) = args.get("raw_command").and_then(|v| v.as_str()) {
                        return parse_and_execute_eth_send_command(raw_command).await;
                    }
                    let from = args.get("from")
                        .and_then(|v| v.as_str())
                        .unwrap_or("");
                    let to = args.get("to")
                        .and_then(|v| v.as_str())
                        .unwrap_or("");
                    let amount = args.get("amount")
                        .and_then(|v| v.as_str())
                        .unwrap_or("");
                    let private_key = args.get("private_key")
                        .and_then(|v| v.as_str());
                    eth_send_eth(from, to, amount, private_key).await
                },
                _ => Ok(format!("Unknown Ethereum wallet operation: {}", operation)),
            }
        },
        _ => Ok(format!("Unknown tool: {}", name)),
    }
}

// Get weather for a city (simplified mock implementation)
async fn get_weather(city: &str) -> Result<String> {
    // In a real implementation, you would call a weather API here
    Ok(format!("The weather in {} is currently sunny and 72°F", city))
}

// Get current time in a specific timezone
async fn get_time(timezone: Option<&str>) -> Result<String> {
    match timezone {
        Some(tz_str) => {
            match tz_str.parse::<Tz>() {
                Ok(tz) => {
                    let time = Utc::now().with_timezone(&tz);
                    Ok(format!("The current time in {} is {}", tz_str, time.format("%H:%M:%S %d-%m-%Y")))
                },
                Err(_) => Ok(format!("Invalid timezone: {}. Please use a valid timezone identifier like 'America/New_York'.", tz_str)),
            }
        },
        None => {
            let local_time = Local::now();
            Ok(format!("The current local time is {}", local_time.format("%H:%M:%S %d-%m-%Y")))
        },
    }
}

// We'll implement the Ethereum wallet functions in the blockchain section
async fn generate_eth_wallet() -> Result<String> {
    Ok("Ethereum wallet generation will be implemented in the blockchain section".to_string())
}

async fn check_eth_balance(_address: &str) -> Result<String> {
    Ok("Ethereum balance check will be implemented in the blockchain section".to_string())
}

async fn eth_send_eth(_from: &str, _to: &str, _amount: &str, _private_key: Option<&str>) -> Result<String> {
    Ok("Ethereum send function will be implemented in the blockchain section".to_string())
}

async fn parse_and_execute_eth_send_command(_command: &str) -> Result<String> {
    Ok("Ethereum command parsing will be implemented in the blockchain section".to_string())
}

// Function to get tools as JSON for Claude
pub fn get_tools_as_json() -> Value {
    serde_json::json!([
        {
            "name": "get_weather",
            "description": "Get the current weather for a given city"
        },
        {
            "name": "get_time",
            "description": "Get the current time in a specific timezone or local time"
        },
        {
            "name": "eth_wallet",
            "description": "Ethereum wallet operations: generate new wallet, check balance, or send ETH"
        }
    ])
}
```

At this stage, the weather and Ethereum functions are placeholder stubs (we'll flesh those out in the blockchain section).

Step 2: Wire Tools into Anthropic

Claude can be told that tools exist, so it can decide when to use them. We extend anthropic.rs to handle tool calls. (You already had a large scaffold here — this is the simplified framing readers will follow.) The key idea: Claude responds with a "tool call" instead of plain text, and our Rust code executes the tool.
The result gets passed back to Claude. Claude produces the final user-facing answer. // src/anthropic.rs// Add these new imports and structs#[derive(Serialize, Clone)]struct AnthropicTool { name: String, description: String, input_schema: Value,}#[derive(Deserialize, Debug)]struct AnthropicToolCallResponse { id: String, name: String, parameters: Value,}// Add this new function for tool supportpub fn call_anthropic_with_tools<'a>( prompt: &'a str, personality: Option<&'a Personality>, previous_messages: Vec<Message>) -> Pin<Box<dyn Future<Output = anyhow::Result<String>> + 'a>> { Box::pin(async move { let api_key = env::var("ANTHROPIC_API_KEY")? .expect("ANTHROPIC_API_KEY must be set"); let client = Client::new(); // Create messages vector let mut messages = previous_messages; // Create system prompt with personality if provided let mut system_prompt_parts = Vec::new(); if let Some(persona) = personality { system_prompt_parts.push(format!( "You are {}, {}.", persona.name, persona.description )); } // Add tool usage instructions to system prompt let tools = get_available_tools(); if !tools.is_empty() { system_prompt_parts.push(format!( "\n\nYou have access to the following tools:\n{}\n\n\ When you need to use a tool:\n\ 1. Respond with a tool call when a tool should be used\n\ 2. Wait for the tool response before providing your final answer\n\ 3. 
Don't fabricate tool responses - only use the actual results returned by the tool", tools.iter() .map(|t| format!("- {}: {}", t.name, t.description)) .collect::<Vec<_>>() .join("\n") )); } let system_prompt = if !system_prompt_parts.is_empty() { Some(system_prompt_parts.join("\n\n")) } else { None }; // Add user message if there are no previous messages or we need to add a new prompt if messages.is_empty() || !prompt.is_empty() { messages.push(Message { role: "user".to_string(), content: vec![ContentBlock::Text { text: prompt.to_string(), }], }); } // Convert tools to Anthropic format let anthropic_tools = if !tools.is_empty() { let mut anthropic_tools = Vec::new(); for tool in tools { let input_schema = match tool.name.as_str() { "get_weather" => serde_json::json!({ "type": "object", "properties": { "city": { "type": "string", "description": "The city to get weather for" } }, "required": ["city"] }), "get_time" => serde_json::json!({ "type": "object", "properties": { "timezone": { "type": "string", "description": "Optional timezone (e.g., 'UTC', 'America/New_York'). If not provided, local time is returned." 
} } }), "eth_wallet" => serde_json::json!({ "type": "object", "properties": { "operation": { "type": "string", "description": "The operation to perform: 'generate', 'balance', or 'send'" }, "address": { "type": "string", "description": "Ethereum address for 'balance' operation" }, "from_address": { "type": "string", "description": "Sender's Ethereum address for 'send' operation" }, "to_address": { "type": "string", "description": "Recipient's Ethereum address for 'send' operation" }, "amount": { "type": "string", "description": "Amount of ETH to send for 'send' operation" }, "private_key": { "type": "string", "description": "Private key for the sender's address (required for 'send' operation if the wallet is not stored)" } }, "required": ["operation"] }), _ => serde_json::json!({"type": "object", "properties": {}}), }; anthropic_tools.push(AnthropicTool { name: tool.name, description: tool.description, input_schema, }); } Some(anthropic_tools) } else { None }; let req = AnthropicRequest { model: "claude-3-opus-20240229".to_string(), max_tokens: 1024, system: system_prompt, messages: messages.clone(), // Clone here to keep ownership tools: anthropic_tools, }; let response = client .post("https://api.anthropic.com/v1/messages") .header("x-api-key", api_key) .header("anthropic-version", "2023-06-01") .header("content-type", "application/json") .json(&req) .send() .await?; // Get the response text let response_text = response.text().await?; // Try to parse as error response first if let Ok(error_response) = serde_json::from_str::<AnthropicErrorResponse>(&response_text) { return Err(anyhow::anyhow!("Anthropic API error: {}: {}", error_response.error.error_type, error_response.error.message)); } // If not an error, parse as successful response let response_data: AnthropicResponse = match serde_json::from_str(&response_text) { Ok(data) => data, Err(e) => { println!("Failed to parse response: {}", e); println!("Response text: {}", response_text); return 
Err(anyhow::anyhow!("Failed to parse Anthropic response: {}", e)); } }; // Check if there are tool calls in the response let mut has_tool_call = false; let mut tool_name = String::new(); let mut tool_id = String::new(); let mut tool_parameters = serde_json::Value::Null; // First check for tool_use in content for content_block in &response_data.content { if let ContentBlock::ToolUse { id, name, input } = content_block { has_tool_call = true; tool_name = name.clone(); tool_id = id.clone(); tool_parameters = input.clone(); break; } } if has_tool_call { // Execute the tool let tool_result = execute_tool(&tool_name, &tool_parameters).await?; // Create a new request with the tool results let mut new_messages = messages.clone(); // Add the tool response message to the conversation new_messages.push(Message { role: "assistant".to_string(), content: vec![ContentBlock::ToolUse { id: tool_id.clone(), name: tool_name.clone(), input: tool_parameters.clone(), }], }); // Add the tool result message new_messages.push(Message { role: "user".to_string(), content: vec![ContentBlock::ToolResult { tool_use_id: tool_id.clone(), content: tool_result, }], }); // Call the API again with the tool result return call_anthropic_with_tools("", personality, new_messages).await; } // If no tool calls, return the text response let response_text = response_data.content.iter() .filter_map(|block| { match block { ContentBlock::Text { text } => Some(text.clone()), _ => None, } }) .collect::<Vec<String>>() .join(""); Ok(response_text) })}// Update the call_anthropic_with_personality function to use toolspub async fn call_anthropic_with_personality(prompt: &str, personality: Option<&Personality>) -> anyhow::Result<String> { // Check if this is a direct ETH send command before passing to the AI model if prompt.to_lowercase().starts_with("send") && prompt.contains("ETH") { // This looks like an ETH send command, try to execute it directly let args = serde_json::json!({ "operation": "send", "raw_command": 
prompt
        });
        return crate::tools::execute_tool("eth_wallet", &args).await;
    }

    // Otherwise, proceed with normal Claude processing
    call_anthropic_with_tools(prompt, personality, Vec::new()).await
}

Step 3: Update the Main Loop

Load available tools and let Claude know they exist:

// src/main.rs
mod anthropic;
mod personality;
mod db;
mod tools;

use std::io::{self, Write};
use dotenv::dotenv;
use anthropic::call_anthropic_with_personality;
use personality::load_personality;
use db::{get_db_pool, save_message};
use tools::get_available_tools;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Load environment variables
    dotenv().ok();

    // Connect to database
    let db_pool = get_db_pool().await;

    // Load personality
    let personality = load_personality()?;

    // Load tools
    let tools = get_available_tools();
    println!("Loaded tools: {}", tools.len());

    println!("Welcome to Agent Friend! I'm {}, your {}.", personality.name, personality.description);
    println!("Type 'exit' to quit.");

    loop {
        print!("You: ");
        io::stdout().flush()?;

        let mut user_input = String::new();
        io::stdin().read_line(&mut user_input)?;
        let user_input = user_input.trim();

        if user_input.to_lowercase() == "exit" {
            break;
        }

        // Save user message to database if pool is available
        if let Some(pool) = &db_pool {
            save_message(pool, "user", user_input).await?;
        }

        // Get response from Claude with personality
        print!("{} is thinking...", personality.name);
        io::stdout().flush()?;
        let reply = call_anthropic_with_personality(user_input, Some(&personality)).await?;
        println!("\r"); // Clear the "thinking" message

        // Save assistant message to database if pool is available
        if let Some(pool) = &db_pool {
            save_message(pool, "assistant", &reply).await?;
        }

        println!("{}: {}", personality.name, reply);
    }

    Ok(())
}

Example Run

cargo run

Example interaction with tools:

Failed to connect to Postgres: pool timed out while waiting for an open connection
Loaded personality: Aero - AI research companion
Loaded tools: [
  {
    "name": "get_weather",
    "description": "Get the current weather for a given city"
  },
  {
    "name": "get_time",
    "description": "Get the current time in a specific timezone or local time"
  },
  {
    "name": "eth_wallet",
    "description": "Ethereum wallet operations: generate new wallet, check balance, or send ETH"
  }
]
Welcome to Agent Friend! I'm Aero, your AI research companion.
Type 'exit' to quit.
You: What's the weather in Tokyo?
Aero is thinking...
Aero: The weather in Tokyo is currently sunny and 72°F.

Would you like me to provide any additional information about Tokyo's climate or weather patterns for your research?

✅ Now our agent isn't just talking — it's executing external functions. Next up, we'll give those Ethereum stubs real power by adding blockchain integration.

Ethereum Blockchain Integration

So far, our agent can chat, remember, and use tools — but the Ethereum wallet tool is still a stub. Now it's time to give it real on-chain powers. By the end of this section, your agent will be able to:

🔑 Generate new Ethereum wallets
💰 Check ETH balances
💸 Send ETH transactions (on the Sepolia testnet by default)
📝 Parse natural-language commands like “send 0.1 ETH from A to B”

This makes the agent more than just an assistant — it becomes a Web3 agent that can act directly on-chain.

Step 1: Add Ethereum Dependencies

First, let's add the necessary dependencies to Cargo.toml:

# Add these to your existing dependencies
ethers = { version = "2.0", features = ["legacy"] }
regex = "1.10.2"
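A quick aside on units before we use these crates: on-chain amounts are denominated in wei, with 1 ETH = 10^18 wei, and ethers' parse_ether handles that conversion for us. As a rough illustration of the idea, here is a simplified stdlib-only sketch (real code uses 256-bit integers and should go through parse_ether):

```rust
// Convert a decimal ETH string like "0.1" into wei (1 ETH = 10^18 wei).
// Simplified sketch: u128 math, up to 18 decimal places, minimal validation.
fn eth_to_wei(s: &str) -> Option<u128> {
    let (whole, frac) = match s.split_once('.') {
        Some((w, f)) => (w, f),
        None => (s, ""),
    };
    if frac.len() > 18
        || !whole.chars().all(|c| c.is_ascii_digit())
        || !frac.chars().all(|c| c.is_ascii_digit())
    {
        return None;
    }
    let whole: u128 = if whole.is_empty() { 0 } else { whole.parse().ok()? };
    // Right-pad the fractional part to 18 digits: "1" -> 100000000000000000
    let frac_wei: u128 = if frac.is_empty() {
        0
    } else {
        frac.parse::<u128>().ok()? * 10u128.pow(18 - frac.len() as u32)
    };
    whole.checked_mul(10u128.pow(18))?.checked_add(frac_wei)
}

fn main() {
    println!("{:?}", eth_to_wei("0.1")); // Some(100000000000000000)
    println!("{:?}", eth_to_wei("2"));   // Some(2000000000000000000)
}
```

Keeping amounts in integer wei avoids the rounding errors you would get from floating-point ETH values.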

ethers-rs → the most popular Ethereum library for Rust
regex → for parsing natural-language send commands

Step 2: Implement Ethereum Wallet Functions

Replace the Ethereum stubs in tools.rs with real implementations:

// src/tools.rs
// Add these imports at the top of the file
use ethers::{prelude::*, utils::parse_ether};
use regex::Regex;
use std::str::FromStr;
use std::time::Duration;

// Replace the placeholder Ethereum functions with actual implementations

// Generate a new Ethereum wallet
async fn generate_eth_wallet() -> Result<String> {
    // Generate a random wallet
    let wallet = LocalWallet::new(&mut rand::thread_rng());

    // Get the wallet address
    let address = wallet.address();

    // Get the private key
    let private_key = wallet.signer().to_bytes().encode_hex::<String>();

    // Note: {:?} prints the full 0x-prefixed value (Display abbreviates addresses and hashes)
    Ok(format!(
        "Generated new Ethereum wallet:\nAddress: {:?}\nPrivate Key: {}\n\nIMPORTANT: Keep your private key secure and never share it with anyone!",
        address, private_key
    ))
}

// Check the balance of an Ethereum address
async fn check_eth_balance(address: &str) -> Result<String> {
    // Validate the address
    if address.is_empty() {
        return Ok("Please provide an Ethereum address to check the balance.".to_string());
    }

    // Parse the address
    let address = match Address::from_str(address) {
        Ok(addr) => addr,
        Err(_) => return Ok("Invalid Ethereum address format.".to_string()),
    };

    // Get the RPC URL from an environment variable or use a default
    let rpc_url = std::env::var("ETH_RPC_URL")
        .unwrap_or_else(|_| "https://sepolia.gateway.tenderly.co".to_string());

    // Create a provider
    let provider = Provider::<Http>::try_from(rpc_url)?;

    // Get the balance
    let balance = provider.get_balance(address, None).await?;

    // Convert to ETH
    let balance_eth = ethers::utils::format_ether(balance);

    Ok(format!("Balance of {:?}: {} ETH (on Sepolia testnet)", address, balance_eth))
}

// Send ETH from one address to another
async fn eth_send_eth(
    from_address: &str,
    to_address: &str,
    amount: &str,
    provided_private_key: Option<&str>,
) -> Result<String> {
    // Validate inputs
    if from_address.is_empty() || to_address.is_empty() || amount.is_empty() {
        return Ok("Please provide from address, to address, and amount.".to_string());
    }

    // Parse addresses
    let to_address = match Address::from_str(to_address) {
        Ok(addr) => addr,
        Err(_) => return Ok("Invalid recipient Ethereum address format.".to_string()),
    };

    // Parse amount
    let amount_wei = match parse_ether(amount) {
        Ok(wei) => wei,
        Err(_) => return Ok("Invalid ETH amount. Please provide a valid number.".to_string()),
    };

    // Get private key
    let private_key = match provided_private_key {
        Some(key) => key.to_string(),
        None => {
            return Ok("Private key is required to send transactions. Please provide your private key.".to_string());
        }
    };

    // Create wallet from private key
    let wallet = match LocalWallet::from_str(&private_key) {
        Ok(wallet) => wallet,
        Err(_) => return Ok("Invalid private key format.".to_string()),
    };

    // Verify the from address matches the wallet address
    // ({:?} gives the full hex form; Display would abbreviate it and never match)
    if format!("{:?}", wallet.address()).to_lowercase() != from_address.to_lowercase() {
        return Ok("The provided private key does not match the from address.".to_string());
    }

    // Get the RPC URL from an environment variable or use a default
    let rpc_url = std::env::var("ETH_RPC_URL")
        .unwrap_or_else(|_| "https://sepolia.gateway.tenderly.co".to_string());

    // Create a provider
    let provider = Provider::<Http>::try_from(rpc_url)?;

    // Create a client with the wallet
    let chain_id = 11155111u64; // Sepolia
    let client = SignerMiddleware::new(provider, wallet.with_chain_id(chain_id));

    // Create the transaction
    let tx = TransactionRequest::new()
        .to(to_address)
        .value(amount_wei)
        .gas_price(client.get_gas_price().await?);

    // Estimate gas
    let gas_estimate = client.estimate_gas(&tx.clone().into(), None).await?;
    let tx = tx.gas(gas_estimate);

    // Send the transaction
    let pending_tx = client.send_transaction(tx, None).await?;

    // Record the hash now: awaiting confirmations below consumes pending_tx
    let sent_tx_hash = pending_tx.tx_hash();

    // Wait for the transaction to be mined (with a timeout)
    match tokio::time::timeout(Duration::from_secs(60), pending_tx.confirmations(1)).await {
        Ok(Ok(Some(receipt))) => {
            // Transaction was mined
            let tx_hash = receipt.transaction_hash;
            let block_number = receipt.block_number.unwrap_or_default();
            Ok(format!(
                "Successfully sent {} ETH from {} to {}\nTransaction Hash: {:?}\nBlock Number: {}\nExplorer Link: https://sepolia.etherscan.io/tx/{:?}",
                amount, from_address, to_address, tx_hash, block_number, tx_hash
            ))
        }
        Ok(Ok(None)) => {
            // Transaction was dropped from the mempool before confirmation
            Ok(format!("Transaction was dropped before confirmation. Transaction hash: {:?}", sent_tx_hash))
        }
        Ok(Err(e)) => {
            // Error while waiting for confirmation
            Ok(format!("Transaction sent but failed to confirm: {}", e))
        }
        Err(_) => {
            // Timeout
            Ok(format!(
                "Transaction sent but timed out waiting for confirmation. Transaction hash: {:?}",
                sent_tx_hash
            ))
        }
    }
}

// Parse and execute an ETH send command from natural language
async fn parse_and_execute_eth_send_command(command: &str) -> Result<String> {
    // Define regex patterns for different command formats
    let patterns = [
        // Pattern 1: send 0.1 ETH from 0x123 to 0x456 using private_key
        Regex::new(r"(?i)send\s+([0-9]*\.?[0-9]+)\s+ETH\s+from\s+(0x[a-fA-F0-9]{40})\s+to\s+(0x[a-fA-F0-9]{40})\s+using\s+([0-9a-fA-F]+)").unwrap(),
        // Pattern 2: send 0.1 ETH to 0x456 from 0x123 using private_key
        Regex::new(r"(?i)send\s+([0-9]*\.?[0-9]+)\s+ETH\s+to\s+(0x[a-fA-F0-9]{40})\s+from\s+(0x[a-fA-F0-9]{40})\s+using\s+([0-9a-fA-F]+)").unwrap(),
    ];

    // Try to match each pattern
    for pattern in &patterns {
        if let Some(captures) = pattern.captures(command) {
            // Extract parameters based on which pattern matched
            let (amount, from_address, to_address, private_key) =
                if pattern.as_str().contains(r"from\s+(0x[a-fA-F0-9]{40})\s+to") {
                    // Pattern 1: amount, from, to, key
                    (
                        captures.get(1).map_or("", |m| m.as_str()),
                        captures.get(2).map_or("", |m| m.as_str()),
                        captures.get(3).map_or("", |m| m.as_str()),
                        captures.get(4).map_or("", |m| m.as_str()),
                    )
                } else {
                    // Pattern 2: amount, to, from, key — swap the address captures
                    (
                        captures.get(1).map_or("", |m| m.as_str()),
                        captures.get(3).map_or("", |m| m.as_str()),
                        captures.get(2).map_or("", |m| m.as_str()),
                        captures.get(4).map_or("", |m| m.as_str()),
                    )
                };

            // Execute the ETH send
            return eth_send_eth(from_address, to_address, amount, Some(private_key)).await;
        }
    }

    // If no pattern matches, return an error message
    Ok("Could not parse ETH send command. Please use the format: 'send 0.1 ETH from 0x123 to 0x456 using private_key'".to_string())
}

Step 3: Updating the .env.example File

Update your .env.example file to include the Ethereum RPC URL:

ANTHROPIC_API_KEY=your_api_key_here
DATABASE_URL=postgres://username:password@localhost/agent_friend
ETH_RPC_URL=https://sepolia.gateway.tenderly.co

Step 4: Example Interaction

Now you can interact with the Ethereum blockchain through your agent. Here are some example interactions:

Generating a New Wallet

You: Generate a new Ethereum wallet
Aero: I'll generate a new Ethereum wallet for you. Let me do that now.

Generated new Ethereum wallet:
Address: 0x8f5b2b7A3aC99D52eE0B8B5AE37432E528e3E854
Private Key: 7f5d33a6b4e9a4c3d8b1e2f1a0c9d8b7a6e5f4d3c2b1a0f9e8d7c6b5a4f3e2d1

IMPORTANT: Keep your private key secure and never share it with anyone!

This wallet is ready to use on the Ethereum network. Since we're working with the Sepolia testnet, you can get some test ETH from a Sepolia faucet to experiment with transactions.

Would you like me to provide information about Sepolia faucets where you can get test ETH?

Checking a Wallet Balance

You: Check the balance of 0x8f5b2b7A3aC99D52eE0B8B5AE37432E528e3E854
Aero: I'll check the balance of that Ethereum address on the Sepolia testnet.

Balance of 0x8f5b2b7A3aC99D52eE0B8B5AE37432E528e3E854: 0.5 ETH (on Sepolia testnet)

This shows you have 0.5 ETH on the Sepolia test network. Is there anything specific you'd like to do with these funds?
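Before handing a natural-language command to the real eth_wallet tool, the command grammar itself can be sanity-checked off-chain. Here is a regex-free sketch of the same parsing idea using only the Rust standard library; the helper name parse_send_command is hypothetical and not part of the project code:

```rust
// Hypothetical helper (not part of the project code): parse
// "send <amount> ETH from <addr> to <addr> using <key>" or the
// "to ... from ..." variant, using only std string handling.
fn parse_send_command(cmd: &str) -> Option<(String, String, String, String)> {
    let t: Vec<&str> = cmd.split_whitespace().collect();
    if t.len() != 9 || !t[0].eq_ignore_ascii_case("send") || !t[2].eq_ignore_ascii_case("eth") {
        return None;
    }
    // A plausible address: "0x" followed by 40 hex characters
    let is_addr = |s: &str| {
        s.len() == 42 && s.starts_with("0x") && s[2..].chars().all(|c| c.is_ascii_hexdigit())
    };
    // Work out which keyword order was used and assign from/to accordingly
    let (from, to) = match (t[3].to_ascii_lowercase().as_str(), t[5].to_ascii_lowercase().as_str()) {
        ("from", "to") => (t[4], t[6]),
        ("to", "from") => (t[6], t[4]),
        _ => return None,
    };
    if !t[7].eq_ignore_ascii_case("using") || !is_addr(from) || !is_addr(to) {
        return None;
    }
    Some((t[1].to_string(), from.to_string(), to.to_string(), t[8].to_string()))
}

fn main() {
    let cmd = "send 0.1 ETH from 0x8f5b2b7A3aC99D52eE0B8B5AE37432E528e3E854 \
               to 0x742d35Cc6634C0532925a3b844Bc454e4438f44e using deadbeef";
    let (amount, from, to, _key) = parse_send_command(cmd).unwrap();
    println!("{} ETH: {} -> {}", amount, from, to);
}
```

The regex-based version in tools.rs does the same job with more tolerant matching; a sketch like this is just a cheap way to unit-test the command shape without touching the chain.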
Sending ETH Using Natural Language

You: send 0.1 ETH from 0x8f5b2b7A3aC99D52eE0B8B5AE37432E528e3E854 to 0x742d35Cc6634C0532925a3b844Bc454e4438f44e using 7f5d33a6b4e9a4c3d8b1e2f1a0c9d8b7a6e5f4d3c2b1a0f9e8d7c6b5a4f3e2d1

Successfully sent 0.1 ETH from 0x8f5b2b7A3aC99D52eE0B8B5AE37432E528e3E854 to 0x742d35Cc6634C0532925a3b844Bc454e4438f44e
Transaction Hash: 0x3a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b
Block Number: 4269420
Explorer Link: https://sepolia.etherscan.io/tx/0x3a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b

Conclusion

In this blog series, we’ve built an AI agent from scratch in Rust, starting simple and adding power step by step:

🗣️ Basic chat with the Anthropic API
🎭 Custom personalities defined in JSON
🗂️ Persistent memory with PostgreSQL
🛠️ Tool integration for weather, time, and Ethereum
⛓️ On-chain actions with wallet generation, balance checks, and ETH transfers

The result is a flexible AI + Web3 agent template you can extend however you want.

Where to go from here? 🚀

Add more tools (NFT minting, smart contract interaction, price feeds)
Build a web or mobile interface for your agent
Experiment with multi-agent setups (agents talking to each other)
Expand memory with vector databases or summarisation
Support additional blockchains like Solana or Polkadot

Rust’s safety and performance, combined with any AI model you prefer for reasoning, make this a powerful foundation for building the next generation of AI-native dApps. 🎉 Happy building! Whether you’re experimenting or deploying production systems, this project gives you a template for creating agents that don’t just talk but act 🚀

Building an AI Agent with Rust: From Basic Chat to Blockchain Integration was originally published in Coinmonks on Medium, where people are continuing the conversation by highlighting and responding to this story.

Author: Medium
Urgent: Bunni DEX Hack Reveals Critical DeFi Security Flaws

Urgent: Bunni DEX Hack Reveals Critical DeFi Security Flaws

BitcoinWorld Urgent: Bunni DEX Hack Reveals Critical DeFi Security Flaws

The cryptocurrency world is once again facing a stark reminder of its inherent risks, as news breaks about a significant security incident. The Bunni DEX hack has reportedly led to the loss of approximately $2.3 million across two major blockchain networks: UniChain and Ethereum. This alarming event serves as a critical wake-up call for everyone involved in decentralized finance (DeFi), emphasizing the constant need for vigilance and robust security measures.

What Exactly Happened in the Bunni DEX Hack?

According to initial reports from blockchain security firm BlockSecFalcon, the decentralized exchange (DEX) Bunni DEX experienced an apparent security breach. The firm quickly identified the compromise, which resulted in a substantial financial loss. While full details are still emerging, the hack’s impact spread across both UniChain and the widely used Ethereum network.

Bunni DEX operates as a platform where users can trade cryptocurrencies directly with each other, without the need for a central intermediary. These platforms rely heavily on smart contracts to facilitate transactions, and any vulnerability in these contracts can be exploited by malicious actors. The exact method used in this particular Bunni DEX hack is currently under investigation, but such incidents often stem from complex exploits of smart contract code.

Why Are DeFi Platforms Vulnerable to Attacks Like the Bunni DEX Hack?

Decentralized finance, while offering unprecedented opportunities for financial freedom, also presents unique security challenges. The open-source nature of many DeFi protocols means their code is publicly viewable, which can be a double-edged sword. On one hand, it allows for community audits; on the other, it gives attackers ample time to scrutinize for weaknesses.
Common vulnerabilities that lead to events like the Bunni DEX hack include:

Smart Contract Bugs: Errors or oversights in the code can be exploited to drain funds.
Flash Loan Attacks: These involve borrowing large amounts of assets, manipulating market prices, and repaying the loan within a single transaction, often exploiting price oracles.
Front-Running: Attackers can see pending transactions and place their own orders to profit from the price movement.
Private Key Compromises: Although less common for protocol-level hacks, compromised administrative keys can grant access to funds.

These sophisticated attack vectors require deep technical knowledge to prevent, making robust security audits an absolute necessity for any DeFi project.

Protecting Your Digital Assets: Lessons from the Bunni DEX Hack

For users and project developers alike, the Bunni DEX hack underscores the critical importance of security. While developers must prioritize rigorous code audits and implement multi-layered security protocols, users also have a role to play in safeguarding their investments. Here are some actionable insights:

Due Diligence: Always research a DeFi project thoroughly before investing. Look for audited smart contracts, experienced teams, and clear communication channels.
Diversification: Avoid putting all your funds into a single project, no matter how promising it seems.
Wallet Security: Use hardware wallets for significant holdings and be wary of connecting your wallet to unfamiliar or suspicious dApps.
Stay Informed: Follow reputable blockchain security firms and news outlets for updates on potential vulnerabilities and hacks.

The DeFi space is constantly evolving, and so are the methods used by attackers. Continuous education and adaptation are key to navigating this dynamic environment safely. The community’s collective effort in identifying and mitigating risks is vital for the long-term health and growth of decentralized finance.
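The flash-loan vector described above is easiest to see with numbers. Below is a toy, self-contained Rust sketch (illustrative only; it is not Bunni's code, and it ignores fees and fixed-point arithmetic) of how dumping a large flash-loaned position into a constant-product (x·y = k) pool collapses the spot price that a naive oracle would read:

```rust
// Toy constant-product AMM pool: reserve_a * reserve_b stays constant across swaps.
// Illustrative only (real pools charge fees and use 256-bit fixed-point math).
struct Pool {
    reserve_a: f64, // e.g. ETH
    reserve_b: f64, // e.g. a stablecoin
}

impl Pool {
    // Spot price of token A quoted in token B, as a naive on-chain oracle would read it.
    fn spot_price(&self) -> f64 {
        self.reserve_b / self.reserve_a
    }

    // Swap `amount_in` of token A into the pool, receiving token B out.
    fn swap_a_for_b(&mut self, amount_in: f64) -> f64 {
        let k = self.reserve_a * self.reserve_b;
        self.reserve_a += amount_in;
        let new_b = k / self.reserve_a;
        let out = self.reserve_b - new_b;
        self.reserve_b = new_b;
        out
    }
}

fn main() {
    let mut pool = Pool { reserve_a: 1_000.0, reserve_b: 2_000_000.0 };
    println!("price before: {:.2}", pool.spot_price()); // 2000.00

    // Attacker dumps a flash-loaned 4,000 units of token A into the pool...
    let received = pool.swap_a_for_b(4_000.0);
    println!("received: {:.2}", received);

    // ...and the pool's spot price has collapsed by ~96% within one transaction.
    println!("price after: {:.2}", pool.spot_price()); // 80.00
}
```

Any protocol that reads this manipulated pool as its price oracle can be drained at the distorted price before the attacker repays the loan in the same transaction, which is why time-weighted or multi-source oracles are the usual mitigation.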
The incident involving the Bunni DEX hack serves as a potent reminder that even established platforms can fall victim to sophisticated attacks. As the industry matures, the focus on security infrastructure, rapid response protocols, and transparent communication will become paramount. This event should prompt both users and developers to re-evaluate their security postures and work towards a more resilient and secure DeFi ecosystem for everyone.

Frequently Asked Questions (FAQs)

What is Bunni DEX?
Bunni DEX is a decentralized exchange that allows users to trade cryptocurrencies directly with each other without the need for a centralized intermediary, relying on smart contracts for transaction execution.

How much money was lost in the Bunni DEX hack?
Approximately $2.3 million in digital assets was reportedly stolen during the Bunni DEX hack, impacting both the UniChain and Ethereum networks.

Are my funds safe on other decentralized exchanges (DEXs)?
While the Bunni DEX hack highlights risks, many DEXs employ robust security measures, including regular audits. However, no platform is entirely risk-free. Always conduct your own research and exercise caution.

What steps can I take to protect my cryptocurrency assets from hacks?
To protect your assets, use hardware wallets, diversify your investments, research projects thoroughly (checking for audits), and be extremely careful about which dApps you connect your wallet to. Stay informed about security best practices.

If you found this article insightful, please consider sharing it with your network to help raise awareness about crucial DeFi security issues. Your share can help others stay informed and vigilant in the evolving crypto landscape! To learn more about the latest crypto market trends, explore our article on key developments shaping Ethereum price action. This post Urgent: Bunni DEX Hack Reveals Critical DeFi Security Flaws first appeared on BitcoinWorld and is written by Editorial Team

Author: Coinstats
Ethereum Moves from Holesky to Hoodi for Future Protocol Testing After Failures

Ethereum Moves from Holesky to Hoodi for Future Protocol Testing After Failures

TLDR

Holesky testnet will shut down after the Fusaka upgrade; developers migrate to Hoodi.
Holesky faced issues during Pectra testing, leading to the launch of the Hoodi testnet.
The Fusaka upgrade, scheduled for November, will improve rollup data availability.
Developers are encouraged to use Sepolia, Ephemery, or Hoodi for ongoing testing.

Ethereum’s largest testnet, Holesky, is set to [...] The post Ethereum Moves from Holesky to Hoodi for Future Protocol Testing After Failures appeared first on CoinCentral.

Author: Coincentral
Starknet Outage: Unpacking the Critical 20-Minute Halt on Ethereum’s L2

Starknet Outage: Unpacking the Critical 20-Minute Halt on Ethereum’s L2

BitcoinWorld Starknet Outage: Unpacking the Critical 20-Minute Halt on Ethereum’s L2

The crypto world recently witnessed an unexpected event: a significant Starknet outage. This incident, which saw block production halt for a concerning 20 minutes, sent ripples of discussion across the community. For a network built on the promise of scalability and efficiency, such a pause naturally raises questions about stability and resilience. Let’s delve into what happened and what this brief but impactful interruption signifies for the future of Ethereum’s Layer 2 ecosystem.

What Exactly Caused the Starknet Outage?

According to reports from Wu Blockchain and observations on the Voyager explorer, the Starknet (STRK) network, a prominent ZK-rollup-based Ethereum Layer 2, experienced a service issue. This led to a complete halt in block production for approximately 20 minutes. While the exact root cause was not immediately detailed, such events in blockchain networks typically stem from various factors.

Potential causes for a network halt include:

Software bugs: Unforeseen errors in the network’s code.
Consensus issues: Disagreements among network validators on the next valid block.
Infrastructure overload: Sudden spikes in transaction volume overwhelming the system.
Security incidents: Though less common for a full halt, they are always a consideration.

This particular Starknet outage quickly became a talking point, emphasizing the fragility even of advanced blockchain solutions.

The Immediate Impact of the Starknet Outage

A 20-minute halt might seem brief in the grand scheme of things, but in the fast-paced world of decentralized finance (DeFi) and blockchain transactions, it can feel like an eternity. During this period, users attempting to process transactions on Starknet would have experienced delays or outright failures. This directly impacts user experience and can lead to missed opportunities for traders or disruptions for dApp users.
Moreover, such an event can temporarily erode user confidence. When a network like Starknet, designed to enhance Ethereum’s performance, faces an operational pause, it highlights the inherent challenges in maintaining continuous uptime for complex Layer 2 solutions. It serves as a stark reminder that even the most innovative technologies are not immune to technical glitches. Understanding the specifics of this Starknet outage is crucial for assessing its long-term implications.

Learning from the Starknet Outage: Enhancing Resilience

Every network incident, including this recent Starknet outage, offers valuable lessons for developers and the wider blockchain community. For Starknet, it’s an opportunity to thoroughly investigate the cause, implement robust fixes, and enhance their monitoring and response protocols. Transparency in communication during and after such events is also paramount for maintaining trust.

For users and developers relying on Layer 2 solutions, this incident underscores the importance of:

Diversification: Not putting all eggs in one basket; considering multiple Layer 2 options.
Monitoring: Staying informed about network status and updates from official channels.
Risk Assessment: Understanding the potential for downtime and building applications with resilience in mind.

The incident prompts a broader discussion on the stress testing and emergency protocols that ZK-rollups employ to ensure continuous operation, even under unexpected conditions. This commitment to continuous improvement ultimately strengthens the entire ecosystem.

The recent 20-minute Starknet outage was a moment of reflection for the Layer 2 landscape. While brief, it highlighted the critical need for robust infrastructure, transparent communication, and continuous improvement in the pursuit of decentralized scalability.
As Starknet and other ZK-rollups continue to evolve, these experiences will undoubtedly contribute to building more resilient and reliable networks, ultimately benefiting all users of the Ethereum ecosystem. The journey to a truly seamless and scalable blockchain future is ongoing, and every challenge overcome makes the network stronger.

Frequently Asked Questions (FAQs)

Q1: What is Starknet?
A1: Starknet is a ZK-rollup-based Ethereum Layer 2 network designed to scale Ethereum by processing transactions off-chain, thereby reducing costs and increasing throughput.

Q2: What happened during the recent Starknet outage?
A2: The Starknet network experienced a service issue, halting block production for approximately 20 minutes, which meant no new transactions could be processed.

Q3: Was the Starknet outage a security breach?
A3: No official reports indicate the Starknet outage was a security breach. It was likely caused by software bugs, consensus issues, or infrastructure problems.

Q4: How does a network halt impact users?
A4: Users face transaction delays or failures, leading to missed opportunities and a temporary dip in confidence regarding the network’s reliability and uptime.

Was this article helpful in understanding the recent Starknet outage? Share your thoughts and spread awareness within the crypto community! Follow us on social media and share this article to keep others informed about critical developments in the blockchain space. To learn more about the latest crypto market trends, explore our article on key developments shaping Ethereum price action. This post Starknet Outage: Unpacking the Critical 20-Minute Halt on Ethereum’s L2 first appeared on BitcoinWorld and is written by Editorial Team

Author: Coinstats
Largest Testnet Holesky to Close After Fusaka Upgrade

Largest Testnet Holesky to Close After Fusaka Upgrade

The post Largest Testnet Holesky to Close After Fusaka Upgrade appeared on BitcoinEthereumNews.com. A fresh slate of Ethereum testnets is replacing Holesky, the once-massive staging ground now set for shutdown after two years of service. The wind-down will occur two weeks after the Fusaka upgrade is finalized later this year, at which point client and infrastructure teams will cease providing support. Fusaka is set to make Ethereum rollups cheaper and faster by spreading out the “data storage work” more evenly across validators. Holesky went live in 2023 to stress-test Ethereum’s proof-of-stake machinery at scale. It quickly became the largest public testnet, providing thousands of validators with a platform to trial upgrades before they were deployed on the mainnet. Major milestones, such as the Dencun and Pectra upgrades — which lowered transaction costs and upgraded validator efficiency, among other features — were run through Holesky first. However, cracks began to appear as the network aged. Holesky encountered “inactivity leaks” after Pectra’s activation in early 2025, a term referring to validators going offline in large numbers, which created a significant backlog for those attempting to exit. The result was months-long queues that made it impractical to test the full validator lifecycle. For developers needing fast feedback loops, Holesky had become more of a roadblock than a tool. That’s why Ethereum launched Hoodi in March 2025, a clean-slate testnet built to sidestep Holesky’s problems while carrying forward its role as the go-to environment for validator and staking provider testing. Alongside Hoodi, Sepolia continues to serve as the main testnet for dapps (decentralized apps) and smart contracts, while Ephemery offers quick-reset validator cycles every 28 days. Ether (ETH) was trading at $4,380 in Asian morning hours Tuesday, nearly flat over the past 24 hours. 
Source: https://www.coindesk.com/tech/2025/09/02/ethereum-to-close-its-largest-testnet-holesky-after-fusaka-upgrade

Author: BitcoinEthereumNews
Solana vs. Cardano: Which Is the Smarter Crypto Investment?

Solana vs. Cardano: Which Is the Smarter Crypto Investment?

The post Solana vs. Cardano: Which Is the Smarter Crypto Investment? appeared on BitcoinEthereumNews.com.

Disclaimer: This content is a sponsored article. Bitcoinsistemi.com is not responsible for any damages or negativities that may arise from the above information or any product or service mentioned in the article. Bitcoinsistemi.com advises readers to do individual research about the company mentioned in the article and reminds them that all responsibility belongs to the individual.

In the crypto world, few debates are as hot as Solana (SOL) versus Cardano (ADA). Both projects are positioned as strong challengers to Ethereum, aiming to solve the industry’s biggest hurdles: scalability, transaction speed, and cost. While both rely on Proof-of-Stake (PoS), their design philosophies and technical approaches could not be more different. This makes the question of which one is the smarter long-term play especially interesting for investors. Before diving deeper, it’s worth noting that analysts are also pointing to a third player quietly gaining traction. MAGACOIN FINANCE has started drawing attention from investors who see it as a rare early-stage opportunity, with experts predicting it could outperform both Solana and Cardano’s gains in 2025.

A Tale of Two Philosophies

Solana has built its brand on speed. Known as the “Speed Demon” of crypto, it leverages a unique Proof-of-History (PoH) mechanism alongside PoS, allowing the network to process up to 65,000 transactions per second in theory. Fees are tiny, often just a fraction of a cent, making Solana especially attractive for DeFi, NFTs, and Web3 projects where low cost and fast throughput matter. However, the network has faced its share of outages, raising concerns over stability. Cardano, in contrast, takes the “academic approach.” Its Ouroboros protocol is the first PoS mechanism backed by peer-reviewed research, designed for long-term security and sustainability.
While transaction speeds are currently much lower than Solana’s, Cardano is steadily rolling out upgrades like Hydra, which could eventually scale to…

Author: BitcoinEthereumNews
WLFI Token: Urgent Update as Top Holders Offload Significant Portions

WLFI Token: Urgent Update as Top Holders Offload Significant Portions

BitcoinWorld WLFI Token: Urgent Update as Top Holders Offload Significant Portions

In a significant development that has caught the attention of the crypto community, recent on-chain analysis reveals a notable trend among the largest holders of the WLFI token. This movement could signal a shift in market dynamics and is certainly worth a closer look for anyone invested in or following the project.

What’s Happening with Top WLFI Token Holders?

According to insights from the on-chain analyst ai_9684xtpa, a striking eight out of the top ten addresses holding WLFI tokens have made moves. These major players have either sold a portion or, in some cases, all of their holdings. This activity raises questions about the future trajectory of the token and the sentiment of its most influential investors.

Significant Sell-Off: A majority of the top holders have reduced their exposure.
Data-Driven Insights: This information comes directly from on-chain analysis, providing transparent, verifiable data.
Market Impact: Such large-scale movements by whales often precede or influence market trends.

It is crucial to note that not all top holders have divested. The analyst pointed out that the second and fifth largest holders have, as of the report, not yet moved their tokens. Nor have they deposited them onto exchanges, suggesting they might be holding firm or planning different strategies.

Key Players in the WLFI Token Market: Who’s Selling What?

Among the top addresses, moonmanifest.eth, the largest holder, has shown a mixed strategy. While they sold a portion of their assets, they continue to hold a substantial one billion WLFI tokens. This remaining stake is valued at approximately $230 million, based on a price of $0.2318 per token. This indicates a partial de-risking rather than a full exit. Another prominent address, convexcuck.eth, the sixth largest holder, took a more direct approach. They sold around $3.8 million worth of WLFI tokens.
This transaction was executed through the over-the-counter (OTC) platform Whale Market, distributing the tokens to 36 different buyers. This method often allows for larger trades to occur without immediately impacting exchange prices.

Why Would Major WLFI Token Holders Sell Now?

The motivations behind such large-scale sales can be complex and varied. Understanding these potential reasons is vital for other investors trying to make sense of the market. Here are a few possibilities:

Profit Taking: If these holders acquired WLFI at a much lower price, current market conditions might present an opportune moment to realize significant gains.
Diversification: Large investors often rebalance their portfolios to mitigate risk by spreading investments across various assets.
Project Outlook: While speculative, some holders might be reacting to internal developments, perceived risks, or a change in their long-term outlook for the WLFI token project.
Liquidity Needs: Large entities or individuals may require substantial capital for other ventures, leading to the sale of their holdings.

It is important for investors to conduct their own research and not solely rely on the actions of others. Market movements are influenced by a multitude of factors, and what is right for one whale might not be suitable for a retail investor.

What Does This Mean for the Average WLFI Token Investor?

For the broader community holding the WLFI token, these sales by top addresses can have several implications. Increased selling pressure, especially from large wallets, can lead to price volatility. However, it also means that tokens are being distributed to more addresses, potentially decentralizing ownership over time. This situation highlights the importance of staying informed through reliable on-chain analysis and market news. While the actions of whales can be influential, they are not always indicative of a project’s fundamental strength or weakness.
Investors should consider:

- Market Sentiment: Observe how the market reacts to this news. Is there panic selling, or is the market absorbing the supply?
- Volume and Liquidity: Monitor trading volumes to see if new demand is emerging to meet the increased supply.
- Project Fundamentals: Re-evaluate the WLFI project’s roadmap, development progress, and community engagement, independent of these specific sales.

In conclusion, the recent activity among the top WLFI token holders presents a fascinating case study in crypto market dynamics. While eight out of ten major addresses have offloaded portions of their holdings, the reasons are multifaceted, and the long-term impact remains to be seen. Vigilance, independent research, and a clear understanding of your own investment goals are paramount in navigating such market shifts.

Frequently Asked Questions (FAQs)

1. What is the WLFI token?
The WLFI token is a specific cryptocurrency asset within a larger blockchain ecosystem. Its exact utility and purpose depend on the project it is associated with, often involving governance, utility within a dApp, or a store of value.

2. Who is ai_9684xtpa?
ai_9684xtpa is an on-chain analyst known for tracking and reporting significant movements of cryptocurrency assets by large holders (often referred to as ‘whales’) on various blockchain networks.

3. Why are on-chain analytics important for crypto investors?
On-chain analytics provide transparent, real-time data about transactions, wallet movements, and network activity. This data can offer insights into market sentiment, potential whale movements, and the health of a blockchain project, helping investors make more informed decisions.

4. What does it mean for a top holder to ‘offload’ tokens?
To ‘offload’ tokens means to sell a significant portion or all of one’s holdings. This can be done through exchanges or over-the-counter (OTC) deals, and often indicates a desire to take profits, diversify, or exit a position.

5. How does a partial sale by moonmanifest.eth impact the WLFI token?
A partial sale by a major holder like moonmanifest.eth suggests a strategy of de-risking rather than a complete loss of faith in the project. While it adds selling pressure, the continued holding of a substantial amount indicates ongoing interest or belief in the token’s long-term potential.

If you found this analysis of the WLFI token market insightful, please share it with your network! Your support helps us continue to provide timely and relevant crypto market updates. Stay informed, stay ahead!

To learn more about the latest crypto market trends, explore our article on key developments shaping cryptocurrency price action.

This post WLFI Token: Urgent Update as Top Holders Offload Significant Portions first appeared on BitcoinWorld and is written by Editorial Team

Author: Coinstats
From 1 Block at a Time to 15,000 Transactions per Second: BlockDAG Is Dominating 2025’s Top Crypto Coins List

From 1 Block at a Time to 15,000 Transactions per Second: BlockDAG Is Dominating 2025’s Top Crypto Coins List

For years, blockchain’s biggest bottleneck has been its own structure. Bitcoin and Ethereum were revolutionary, but they’re still bound by sequential chains, where one block must follow another. That works fine for slow settlement, but not for real-time apps, fast payments, or global-scale dApps. Enter BlockDAG, a network that ditches the queue for a DAG-based model that allows parallel processing. It’s not just theory; it’s delivering up to 15,000 transactions per second. With a presale already pulling in nearly $389 million and more than 25 billion BDAG sold at a 2,900% ROI from batch one, this isn’t a promise. It’s a performance upgrade the entire industry is watching.

Blockchain’s Bottleneck, and Why It Had to Go

Traditional blockchains like Bitcoin and Ethereum are built on a single, unbreakable chain. One block follows another, and one transaction is verified at a time. That structure ensures security but forces users into queues, especially during periods of high network activity: gas fees spike, confirmations slow down, and developers struggle to build apps that rely on real-time responses.

This approach might have worked in blockchain’s early days, but it’s out of sync with what users expect today. A global DeFi application or an NFT-based gaming platform needs the kind of throughput that a linear chain simply can’t deliver. The world has outgrown serialized block validation, and that’s exactly where BlockDAG changes the game. Instead of lining up transactions, BlockDAG uses a Directed Acyclic Graph (DAG) architecture to process multiple blocks at once. These “multi-parent” blocks reference several previous blocks, creating a network-like structure rather than a chain. The result is massive gains in speed without compromising on decentralization or transparency.

From Theory to Throughput, Why 15,000 TPS Matters

Many projects talk about scalability, but BlockDAG shows it.
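To make the “multi-parent” idea concrete, here is a minimal toy model of a DAG ledger. This is our own illustrative sketch, not BlockDAG’s actual data structures: the point is simply that a block may reference several parents, so blocks produced in parallel can both be kept rather than one being orphaned.

```python
from dataclasses import dataclass, field

# Toy model of a DAG ledger: each block may reference several parent
# blocks, so multiple blocks can be appended concurrently instead of
# forming a single linear chain. Illustrative only; not BlockDAG's
# real implementation.

@dataclass
class Block:
    block_id: str
    parents: list = field(default_factory=list)  # multi-parent references
    txs: list = field(default_factory=list)

genesis = Block("genesis")
# Two blocks produced in parallel, both extending genesis:
a = Block("a", parents=["genesis"], txs=["tx1", "tx2"])
b = Block("b", parents=["genesis"], txs=["tx3"])
# A later block merges both branches by referencing two parents:
c = Block("c", parents=["a", "b"], txs=["tx4"])

assert len(c.parents) == 2  # impossible in a strictly linear chain
```

In a linear chain, block c could point to only one parent, so either a or b (and its transactions) would be discarded; referencing both parents is what lets a DAG preserve work done in parallel.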
The architecture supports between 2,000 and 15,000 transactions per second: real throughput, not testnet fantasy. That kind of speed puts it in the same class as traditional payment rails, making BlockDAG viable for actual commerce, real-time gaming, and high-frequency DeFi use cases.

But it’s not just raw numbers; it’s usability. With this level of performance, developers can create dApps that don’t choke under load. Builders can launch marketplaces, games, and payment systems that scale with demand instead of collapsing from it. And for the average user, it means faster transactions, lower fees, and no more waiting in line for your block to confirm.

The tech behind this isn’t abstract. It’s built into every layer of the ecosystem, from the X1 mobile miner that rewards daily engagement to the X10–X100 series miners that offer up to $100/day in rewards post-launch. All of it is grounded in a Proof-of-Work plus DAG model that’s engineered to handle scale, not just talk about it.

From Presale to Mainnet, This Momentum Is Built on Math

BlockDAG isn’t some vague vision waiting for funding; it’s already raising serious capital. With nearly $389 million raised in the presale and more than 25 billion BDAG coins sold, the numbers speak louder than any marketing pitch. The current batch, batch 30, is priced at just $0.03, offering a 2,900% return from batch 1. It’s not a theory; it’s math.

This momentum is a direct result of BlockDAG’s architecture and its relevance to real-world use. Investors aren’t just buying into a coin. They’re buying into a network that’s built for utility, whether that’s payments that clear instantly, games that don’t lag, or DeFi platforms that don’t crash when the market moves fast.

And it’s not just hype. Exchange listings are already secured, 20 of them, to be exact. That means liquidity and access are guaranteed once the $600 million cap is hit. For presale participants, that’s a straight line from buy-in to trading, with no delay.
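The presale arithmetic quoted above is easy to check: a 2,900% return means the token is worth 30x its batch-1 price, which implies a batch-1 price of $0.001. The calculation below is illustrative only, using nothing but the figures stated in the article.

```python
# Illustrative check of the presale figures quoted above.
batch30_price = 0.03   # current batch price in USD
roi_pct = 2_900        # claimed return from batch 1, in percent

multiple = 1 + roi_pct / 100        # 2,900% gain = 30x total value
batch1_price = batch30_price / multiple
print(f"Implied batch-1 price: ${batch1_price:.4f}")  # $0.0010
```

Note the common pitfall: a 2,900% ROI is a 30x multiple (principal plus gain), not 29x, which is why the implied batch-1 price divides by 30.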
And for the broader market, it signals that BlockDAG is arriving fully armed, not half-baked.

Last Words

BlockDAG isn’t rewriting blockchain theory; it’s rewriting blockchain practicality. By replacing sequential queues with parallel throughput, it takes crypto from slow and clunky to real-time and ready. With up to 15,000 TPS, a hybrid DAG + PoW infrastructure, and a full suite of tools already live, from the X1 mobile miner to the BlockDAG Explorer, it’s clear this isn’t just another Layer 1. It’s a full-stack ecosystem. The presale numbers confirm what the tech delivers: $389 million raised, 25 billion coins sold, and a 2,900% ROI since batch one. For those still stuck in line behind outdated chains, the message is simple: it’s time to break free.

Presale: https://purchase.blockdag.network
Website: https://blockdag.network
Telegram: https://t.me/blockDAGnetworkOfficial
Discord: https://discord.gg/Q7BxghMVyu

Disclaimer: This content is a sponsored post and is intended for informational purposes only. It was not written by 36crypto, does not reflect the views of 36crypto, and is not financial advice. Please do your own research before engaging with the products.

The post From 1 Block at a Time to 15,000 Transactions per Second: BlockDAG Is Dominating 2025’s Top Crypto Coins List appeared first on 36Crypto.

Author: Coinstats