Blockchain Explorer Etherscan Launches ‘Code Reader’ With AI-integration

  • Code Reader, Etherscan’s beta app, incorporates the ChatGPT API.
  • Etherscan Code Reader operates independently of the chatbot’s website.

Etherscan, a well-known Ethereum block explorer, has added ChatGPT to its arsenal of Ethereum blockchain analysis tools. Code Reader, Etherscan’s beta app that incorporates the ChatGPT API into its analytics platform, was released on Monday.

Etherscan stated:

“The Code Reader is a tool that leverages the power of AI to provide users with the ability to retrieve and interpret the source code of a specific contract address.”

A blockchain explorer, sometimes called a block explorer, is a web-based database that allows users to search for and see blockchain-related data and transactions.

Rising AI Integration
Etherscan joins a rapidly growing group of companies integrating AI. Last week, Alchemy, a prominent blockchain development platform, released AlchemyAI, a ChatGPT-based application with a GPT-4 plugin for navigating blockchains. Solana Labs released its own ChatGPT plugin in May.

Because Etherscan Code Reader operates independently of the chatbot’s website, using it requires an OpenAI API key, which is billed separately from a ChatGPT Plus membership.

Code Reader users can use the tool to learn how an underlying contract interacts with decentralized apps, gain a better understanding of a contract’s code through AI-generated explanations, and retrieve complete lists of smart contract functions connected to Ethereum data.
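To make the workflow concrete, here is a minimal sketch of how a tool like Code Reader might combine the two public APIs involved: Etherscan’s contract source endpoint (`module=contract&action=getsourcecode`) and a chat model behind the ChatGPT API. The endpoint parameters are from Etherscan’s public API docs; the function names and prompt wording are illustrative assumptions, not Etherscan’s actual implementation.

```python
import json
import urllib.request

ETHERSCAN_API = "https://api.etherscan.io/api"


def build_source_url(address: str, api_key: str) -> str:
    """Build the Etherscan 'getsourcecode' request URL for a contract address."""
    return (
        f"{ETHERSCAN_API}?module=contract&action=getsourcecode"
        f"&address={address}&apikey={api_key}"
    )


def fetch_contract_source(address: str, api_key: str) -> str:
    """Fetch the verified source code for a contract, if Etherscan has it."""
    with urllib.request.urlopen(build_source_url(address, api_key)) as resp:
        payload = json.load(resp)
    # Etherscan returns a list of results; 'SourceCode' is empty for
    # unverified contracts.
    return payload["result"][0]["SourceCode"]


def build_explain_prompt(source: str) -> str:
    """Compose the question that would be sent to the ChatGPT API."""
    return (
        "Explain in plain English what this Ethereum smart contract does:\n\n"
        + source
    )
```

The fetched source would then be passed, via `build_explain_prompt`, to OpenAI’s chat completions endpoint using the user’s own API key; the model’s reply is the AI-generated explanation shown to the user.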

Etherscan cautions users not to take ChatGPT’s output at face value, not to use the service as evidence or for bug bounties, and to always verify the replies it provides.

This warning responds to a common issue with AI chatbots: inaccurate or misleading answers to user queries, a behavior known as hallucination. An AI hallucination occurs when a model produces confident-sounding output that is not supported by real-world evidence.
