Vitalik: Afraid of AGI and ASI, supports building intelligence-enhancing tools for humans rather than creating super intelligent life

Bitget · 2024/12/21 17:09

On December 22, in response to recent AGI-related developments from OpenAI and the broader AI field, Ethereum co-founder Vitalik Buterin posted on X: "My definition of AGI (Artificial General Intelligence) is: AGI is an artificial intelligence powerful enough that, if one day all humans suddenly disappeared and this AI were uploaded into robot bodies, it would be able to continue civilization on its own.

Obviously, this is a very difficult definition to measure, but I think it captures the core intuitive difference in many people's minds between 'the AI we are used to' and 'AGI': the transition from a tool that constantly depends on human input to a self-sufficient form of life. ASI (Artificial Superintelligence) is another matter entirely; my definition is the point at which humans in the loop no longer add value to productivity (as in chess, where we only reached this point over the past decade).

Yes, ASI scares me; even AGI as I define it scares me, because it carries an obvious risk of loss of control. I support focusing our work on building intelligence-enhancing tools for humans rather than building superintelligent life forms."


Disclaimer: The content of this article solely reflects the author's opinion and does not represent the platform in any capacity. This article is not intended to serve as a reference for making investment decisions.
