Binance Square
Crypto De Nostradame
38.2k views
Using #SOL as an example: it tested the $130-140 support level perfectly and bounced from there. What I expect now is a retest that forms a higher low. During that move, some altcoins could make a new low. BTC could either test the $57k-$59k level with a wick or likewise form a higher low.

$BTC $SOL
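
To make the "support bounce plus higher low" idea concrete, here is a minimal Python sketch. The price series, band boundaries, and helper names are hypothetical illustrations for the pattern described above, not real SOL data and not a trading signal.

```python
# Minimal sketch: detecting a support-band bounce and a higher low in a
# closing-price series. All prices here are hypothetical illustrations.

SUPPORT_LOW, SUPPORT_HIGH = 130.0, 140.0  # the $130-140 band from the post

def bounced_off_support(closes: list[float]) -> bool:
    """True if price dipped into the support band and closed back above it."""
    dipped = any(SUPPORT_LOW <= c <= SUPPORT_HIGH for c in closes[:-1])
    return dipped and closes[-1] > SUPPORT_HIGH

def forms_higher_low(closes: list[float]) -> bool:
    """True if the most recent local minimum sits above the previous one."""
    lows = [c for i, c in enumerate(closes[1:-1], 1)
            if closes[i - 1] > c < closes[i + 1]]  # simple local minima
    return len(lows) >= 2 and lows[-1] > lows[-2]

# Hypothetical closes: dip into the band, bounce, then a retest dip that
# bottoms above the first low -- the "higher low" scenario from the post.
closes = [152, 145, 138, 134, 141, 150, 147, 143, 146, 151]
print(bounced_off_support(closes))  # True: dipped to 134, closed at 151
print(forms_higher_low(closes))     # True: second dip (143) above first (134)
```

The local-minima test is deliberately naive (a single-bar dip counts); a real scanner would smooth the series or require a minimum swing size before calling something a low.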


More posts from this author

Google's response to OpenAI: Project Astra announced

Six years ago, at an I/O developer event, Google showed an AI demo called Duplex that booked an appointment with a barber. Now Google has announced Project Astra. In the demo it shared, Project Astra sees its surroundings through a phone camera and answers questions about them.

Google DeepMind CEO Demis Hassabis said in his keynote that his team is working to develop universal AI agents that can help in daily life, and that Project Astra is a step toward that goal.

In the demo, the person holding the phone walks the camera, and Project Astra with it, around parts of an office while asking questions. At one point the user says aloud, "Tell me when you see something making a sound," and the Gemini-powered AI responds, "I see a speaker making a sound." The user then points at a part of the speaker and asks what it is; the AI answers, "That is the tweeter. It produces high-frequency sounds." Google underlines that the video was recorded in a single shot and in real time.

Later in the video, Gemini identifies and explains pieces of code on a monitor and tells the user which neighborhood they are in based on the view from a window. The most impressive moment comes when the user asks, "Do you remember where you saw my glasses?" Nothing about glasses had been asked earlier, and no glasses were visible in the frame at that moment. Despite this, Gemini replies, "Yes, I remember. Your glasses are next to a red apple."
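
Project Astra itself is not publicly available, but the single-frame question-answering shown in the demo can be loosely approximated with Google's public Gemini API. Below is a minimal sketch under that assumption; the model name, file path, and prompt are illustrative placeholders, and this is not Project Astra's actual implementation (it has no real-time video or memory).

```python
# Minimal sketch: asking a Gemini model a question about one camera frame,
# in the spirit of the Astra demo. Uses the public google-generativeai
# library; model name and file path are assumptions for illustration.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")            # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name

frame = Image.open("office_frame.jpg")  # a single captured camera frame
response = model.generate_content(
    [frame, "What part of this speaker produces high-frequency sounds?"]
)
print(response.text)
```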
