Binance Square
OceanProtocol
DoctorCrypt

How Artificial Intelligence is Shaping the Future of Digital Assets

Artificial Intelligence (AI) is rapidly transforming the cryptocurrency space, playing a pivotal role in shaping the future of digital assets.
🔹AI-driven technologies are enhancing various aspects of crypto, from improving security to optimizing trading strategies. Machine learning models can analyze vast amounts of data to predict market trends, enabling more accurate price forecasting and algorithmic trading.
🔹Projects like SingularityNET (#AGIX) are utilizing AI to create decentralized networks where anyone can access AI services, pushing the boundaries of what blockchain and AI can achieve together.
🔹AI is also boosting security in the crypto world. With increasing concerns over hacks and fraud, AI-based solutions are being integrated to detect suspicious activity in real time, ensuring the safety of digital assets. For example, Fetch.ai $FET uses AI to create autonomous economic agents that optimize network efficiency and detect fraud in the ecosystem.
🔹Additionally, AI is improving scalability and reducing transaction costs, crucial for mass adoption. Projects like #OceanProtocol are working on incorporating AI with data marketplaces to enhance access to AI-driven insights for businesses, helping improve the efficiency and reliability of smart contracts and decentralized finance (DeFi) systems.
🔹As AI continues to evolve, it holds immense potential to reshape not only how cryptocurrencies are traded but also how blockchain technology is integrated into different sectors.
#ArtificialIntelligence #AI #altcoins #CPIHighestSinceJune
Bullish
What you need to know about #Oceanprotocol

Ocean Protocol is a blockchain-based platform designed to facilitate secure and privacy-preserving data sharing. Unlike meme coins, which are often created for humor or speculative purposes, Ocean Protocol has a well-defined use case and aims to create a decentralized data economy. It leverages smart contracts to tokenize data, allowing data owners to monetize their datasets while ensuring privacy.

The OCEAN token is integral to the platform, enabling transactions within the ecosystem, including the purchase and sale of data tokens, participation in governance, and staking to provide liquidity in the Ocean Market

#oceanprotocol #BNBHODLer #Solana_Blockchain

This AI Coin Is Getting Ready to Burn: Analysts Predict a Rebound!

The AI coin project Fetch.ai plans to burn 5 million tokens on January 10, 2025. By doing so, it aims to reduce supply and increase demand. The recent enthusiasm for AI coins has also fueled speculation about a possible FET rebound. Meanwhile, the price of $FET has risen by 2 percent in the past 24 hours. Analysts are predicting a possible rebound towards $3.
Ocean Protocol will increase rewards on March 14

Ocean Protocol has announced that the reward for collecting Ocean data will be doubled to 300,000 OCEAN per week starting March 14th. The rewards are directly proportional to the length of the lockup period, meaning that the longer OCEAN tokens are locked, the higher the rewards.

More detailed information can be found in the official OCEAN tweet

Ocean Protocol is a blockchain-based ecosystem that aims to share and monetize data. The platform allows users to sell and buy data without disclosing the source, giving users the ability to manage and sell data without the risk of privacy violations.

#OCEAN #oceanprotocol
$OCEAN

Ocean Predictoor stats

Ocean Predictoor daily volume is doubling every 18 days
It's now at $130K / day!
That's $3.9M monthly, or $47.5M annually -- ignoring future growth.
It's grown by 50x since November. What will the future hold?
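
For context, the monthly and annual figures follow directly from the daily number. A quick Python sketch, using only the values quoted above, checks the arithmetic and shows what the stated 18-day doubling rate would imply if it continued:

```python
# Sanity check of the volume figures quoted above (illustrative only).
daily_volume = 130_000          # $130K / day, as stated in the post

monthly = daily_volume * 30     # ~ $3.9M per month
annual = daily_volume * 365     # ~ $47.5M per year, ignoring future growth
print(f"monthly ≈ ${monthly/1e6:.1f}M, annual ≈ ${annual/1e6:.1f}M")

# If volume kept doubling roughly every 18 days, growth over 90 days would be:
growth_90d = 2 ** (90 / 18)     # 2^5 = 32x, assuming the trend held
print(f"hypothetical 90-day growth factor: {growth_90d:.0f}x")
```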

What is Ocean Predictoor?
Ocean Predictoor is an on-chain, privacy-enabled, AI-powered application and stack that provides prediction feeds, which are streams of predictions for a given time series, such as the future price of cryptocurrencies like #ETH and #BTC🔥🔥. It operates by allowing “Predictoor” agents to submit individual predictions and stake on them, with the aggregated predictions being sold to trader agents who use them to inform their #Trading decisions. Predictoor is built on the #oceanprotocol stack and uses the Oasis Sapphire privacy-preserving #evm chain to keep predictions private unless paid for. The initial dapp is live at predictoor.ai and is designed for up/down predictions of cryptocurrency prices.
You can check this dapp here - www.predictoor.ai
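
To make the mechanism concrete, here is a minimal Python sketch of the idea described above: agents submit up/down predictions with stakes, and the stake-weighted aggregate is the signal a trader would buy. This is an illustration of the concept only, not the actual on-chain logic.

```python
# Many agents each submit an up/down prediction with a stake; the
# stake-weighted aggregate is the prediction-feed value sold to traders.
from dataclasses import dataclass

@dataclass
class Prediction:
    predicts_up: bool   # True = price will rise in the next epoch
    stake: float        # amount staked behind this prediction

def aggregate(predictions: list[Prediction]) -> float:
    """Return the stake-weighted probability that the price goes up."""
    total = sum(p.stake for p in predictions)
    if total == 0:
        return 0.5
    up_stake = sum(p.stake for p in predictions if p.predicts_up)
    return up_stake / total

feed = [Prediction(True, 120.0), Prediction(False, 40.0), Prediction(True, 60.0)]
print(f"aggregate P(up) = {aggregate(feed):.2f}")   # 180/220 ≈ 0.82
```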

AI-Powered Bots in Ocean Predictoor Get a UX Upgrade: CLI & YAML

The pdr-backend v0.2 release has a command-line interface and a YAML file for setting parameters, making it easier to run bots.

Summary
With Predictoor, you can run #AI-powered prediction bots or trading bots on #crypto price feeds to earn $. The interface to use predictoor bots & trader bots just got a lot simpler, via a CLI and a YAML file for parameters. It also refactors backend code such that we can do more powerful experiments around making $.
1. Intro: Predictoor & Bots
#oceanprotocol Predictoor provides on-chain “prediction feeds” on whether #ETH, #BTC🔥🔥, etc. will rise in the next 5 min or 60 min. “Predictoors” submit predictions and stake on them; predictions are aggregated and sold to traders as alpha. Predictoor runs on Oasis Sapphire, the only confidential EVM chain in production. We launched Predictoor and its Data Farming incentives in September & November 2023, respectively.

The pdr-backend GitHub repo has the Python code for all bots: Predictoor bots, Trader bots, and support bots (submitting true values, buying on behalf of DF, etc).
As a predictoor, you run a predictoor bot with the help of a predictoor bot README in the pdr-backend GitHub repo. It takes 1–2 h to go through, including getting OCEAN & ROSE in Sapphire. The repo provides starting-point predictoor bots, which gather historical CEX price data and build AI/ML models. You can gain your own edge — to earn more $ — by changing the bot as you please: more data, better feature vectors, different modeling approaches, and more.
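
As a hypothetical illustration of what such a starting-point model can look like, the sketch below turns lagged close prices into features and fits a simple up/down classifier with scikit-learn. It mirrors the idea described above but is not the actual pdr-backend code; the synthetic price series stands in for real CEX data.

```python
# Minimal sketch: lagged close prices as features, next-candle direction as label.
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_dataset(closes: np.ndarray, n_lags: int = 5):
    X, y = [], []
    for t in range(n_lags, len(closes) - 1):
        X.append(closes[t - n_lags:t])                   # last n_lags closes
        y.append(1 if closes[t + 1] > closes[t] else 0)  # did price rise next candle?
    return np.array(X), np.array(y)

closes = np.cumsum(np.random.randn(500)) + 100.0         # stand-in for CEX price data
X, y = make_dataset(closes)
model = LogisticRegression(max_iter=1000).fit(X, y)
print("P(up) for the most recent sample:", model.predict_proba(X[-1:])[0, 1])
```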
Similarly, as a trader, you can run a trader bot with the help of a trader bot README. The repo provides starting-point trader bots, which use Predictoor price feeds as alpha for trading. You can gain your own edge — to earn more $ — by changing the bot as you please for more sophisticated trading strategies.
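
A correspondingly minimal trader-side sketch: treat the purchased prediction as alpha and only act when it clears a confidence threshold. The threshold and actions here are arbitrary illustrations, not repo defaults.

```python
# Only trade when the prediction feed is confident enough in either direction.
def decide_trade(prob_up: float, threshold: float = 0.6) -> str:
    if prob_up >= threshold:
        return "BUY"          # feed says "up" with enough margin
    if prob_up <= 1 - threshold:
        return "SELL"
    return "HOLD"             # signal too weak to act on

for p in (0.82, 0.55, 0.31):
    print(p, "->", decide_trade(p))
```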
Predictoor has promising traction, with 1.86M transactions and $1.86M in volume in the previous 30d [Ref DappRadar] [1].

Our main internal goal is to make $ trading, and then take those learnings to the community in the form of product updates and related communications. Towards that, we’ve been eating our own dogfood: running our own predictoor bots & trader bots, and improving things based on our own experience. Most of these improvements go into Predictoor’s backend: the pdr-backend repo.
We’ve evolved it a lot lately, enough to mandate the first big release since launch (yet still pre-v1). That’s what this blog post describes.
The rest of this post is organized as follows. Section 2 describes the prior release (pdr-backend v0.1), and section 3 its challenges. Section 4 describes the new release (pdr-backend v0.2), focusing on its key features of CLI and YAML file, which help usability in running bots. Section 5 describes how v0.2 addresses the challenges of v0.1. Section 6 concludes.
2. About pdr-backend v0.1 Flows
We released pdr-backend when we launched Predictoor in September 2023, and have been continually improving it since then: fixing bugs, reducing onboarding friction, and adding more capabilities (eg simulation flow).
The first official release was v0.1.0 on November 20, 2023; with subsequent v0.1.x releases. It is licensed under Apache V2, a highly permissive open-source license.
In the last v0.1 predictoor bot README, the flow had you do simulation, then run a bot on testnet, then run a bot on mainnet. Let’s elaborate.
Simulation. You’d start simulation with a call like: python pdr_backend/simulation/runtrade.py. It grabs historical data, builds models, predicts, does simulated trades, then repeats, all on historical data. It logs and plots progress in real time. It would run according to default settings: what price feeds to use for AI model inputs, how much training data, etc. Those settings were hardcoded in the runtrade.py script. To change settings, you’d have to change the script itself, or support code.

Run a bot on testnet. First, you’d specify envvars via the terminal: your private key, envvars for network (e.g. RPC_URL), and envvars to specify feeds (PAIR_FILTER, TIMEFRAME_FILTER, SOURCE_FILTER). Then you’d run the bot with a call like: python pdr_backend/predictoor/main.py 3. It would run according to default settings. The 3 meant predictoor approach #3: dynamic model building. To change predictoor settings, you’d have to change code.
Run a bot on mainnet. This was like testnet, except specifying different envvars for network and perhaps private key.
Any further configurations, such as what CEX data sources to use in modeling, would be hardcoded in the script. To use different values, you’d have to modify those in your local copy of the code.
The last v0.1 trader bot README had a similar flow to the v0.1 predictoor bot README.
3. Challenges in pdr-backend v0.1 Flows
We were — and are — proud of the v0.1 predictoor bot & trader bot flows. We’d streamlined them fairly well: one could get going quickly, and accomplish what they needed to. To go further and modify parameters, one would have to jump into Python code. At first glance this might seem like a problem; however, the target users (and actual users) are data scientists or developers, who have no trouble modifying code.
Yet there were a few issues. First, it was annoying to manually change code to change parameters.
We could have written higher-level code that looped, and modified the parameters code at each loop iteration; however, code that changes code is error-prone and can be dangerous. Trader bots and predictoor bots had the same issue, and worse: the Python code for parameter changes was scattered in a few places. Even if the scattering was fixed, the core issue would remain.
Second, envvars didn’t have enough fidelity, and adding more would have led to an unusably high number of envvars.
Recall that we used envvars to specify feeds (PAIR_FILTER, etc). This wasn’t enough detail for all our needs. For example, in running a predictoor bot, one couldn’t use envvars to specify the model output feed (what feed to predict) and model input price feeds, let alone non-price feeds for model inputs. And putting all of that into envvars would be sloppy and error-prone; if we weren’t careful, we’d have a crazy number of envvars.
Third, a messy CLI was organically emerging.
Recall, one would run a predictoor bot with a custom call directly to the script, such as: python pdr_backend/predictoor/main.py 3, where 3 meant approach 3. Similar for the simulation and trader flows. Support for CLI-level parameters was pretty basic, only lightly tested, and was implemented on a per-script basis. Then, from our own runs of predictoor bots, we were starting to do basic analytics, and a new ./scripts/ directory emerged, with each script having its own custom CLI call. Things were getting messier yet.
Finally, we wanted to extend pdr-backend functionality, and doing it in v0.1 code would explode complexity.
We have big plans for our “make $” experiments, and for these, we saw the need to extend functionality by a lot. We wanted to build out a proper analytics app. We wanted to professionalize and operationalize the data pipeline, for use by simulation, the bots, and the analytics app. We wanted to extend simulation into a flow that supported experiments on realtime data, with the possibility of live trading. Doing this would have meant even more parameters and flows; if we kept the v0.1 parameter-setting and messy CLI, complexity would become unwieldy. We needed a cleaner base before we could proceed.
4. Introducing pdr-backend v0.2
We’re pleased to announce the release of pdr-backend v0.2. It solves the issues above 🎉 via a good CLI, and a YAML file to set parameters. It’s live now in the pdr-backend repo.
The rest of this section describes the heavily-updated CLI, the YAML file, and changes to the pdr-backend architecture for a good data pipeline and analytics.
4.1 Updated CLI
You get started with Predictoor like before:

Then, you can type pdr to see the interface at the command-line:

There are commands to run experiments / simulation (pdr xpmt), predictoor bot (pdr predictoor), trader bot (pdr trader), and for people running predictoor bots to claim rewards (pdr claim_OCEAN, pdr claim_ROSE).
There’s a new command to fill the data lake (pdr lake), and several new analytics-related commands (pdr get_predictoors_info, …, pdr check_network). Remaining commands are typically for use by the core team.
To get help for a given command, just type the command without any argument values.
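
For readers unfamiliar with subcommand-style CLIs, the sketch below shows the general shape of such an interface using argparse (the library the CLI modules use, per section 5). The command names come from this post; the argument handling is simplified and is not the actual cli_module.py.

```python
# Rough shape of a "pdr <command> [PPSS_FILE] [NETWORK]" interface.
import argparse

COMMANDS = ["xpmt", "predictoor", "trader", "lake",
            "claim_OCEAN", "claim_ROSE",
            "get_predictoors_info", "check_network"]

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="pdr")
    sub = parser.add_subparsers(dest="command", required=True)
    for cmd in COMMANDS:
        p = sub.add_parser(cmd)
        # Per the post, most commands take a YAML file and a network name;
        # defaults here are illustrative only.
        p.add_argument("ppss_file", nargs="?", default="ppss.yaml")
        p.add_argument("network", nargs="?", default="testnet")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(f"would dispatch: pdr {args.command} {args.ppss_file} {args.network}")
```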
4.2 New: YAML file
The default file is ppss.yaml. Most CLI commands take PPSS_FILE (YAML file) as an input argument. Therefore users can make their own copy from the default ppss.yaml, and modify at will.
The YAML file holds most parameters; the CLI specifies which YAML file and network, and sometimes commonly-updated parameters.
To minimize confusion, there are no envvars. All parameters are in the YAML file or the CLI. One exception: PRIVATE_KEY envvar is retained because putting it in a file would have reduced security.
The YAML file has a sub-section for each bot: a predictoor_ss section, a trader_ss section, etc. The web3_pp section holds info for all networks.
Below is an example of the predictoor_ss section in the YAML file. Note how it specifies a feed to predict (predict_feed), as well as input feeds for AI modeling (aimodel_ss.input_feeds).
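
Since the original embedded example isn't reproduced here, the snippet below reconstructs a plausible predictoor_ss section from the field names mentioned in this post (predict_feed, aimodel_ss.input_feeds) and parses it with PyYAML. The feed strings are hypothetical; the default ppss.yaml shipped in the repo is the authoritative reference.

```python
# Illustrative reconstruction of a predictoor_ss section, parsed with PyYAML.
import yaml  # PyYAML

EXAMPLE_PPSS = """
predictoor_ss:
  predict_feed: binance BTC/USDT c 5m        # hypothetical feed spec
  aimodel_ss:
    input_feeds:
      - binance BTC/USDT ohlcv 5m            # hypothetical model inputs
      - binance ETH/USDT c 5m
"""

cfg = yaml.safe_load(EXAMPLE_PPSS)
predictoor_ss = cfg["predictoor_ss"]
print("predict:", predictoor_ss["predict_feed"])
print("model inputs:", predictoor_ss["aimodel_ss"]["input_feeds"])
```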
Most CLI commands take NETWORK as an input argument. The YAML file holds RPC_URL and other network parameters for each network; the NETWORK CLI argument selects among them. Therefore, to use a different network (e.g. testnet → mainnet), one only needs to change the network name in the CLI. Compare this to v0.1, where several envvars needed changing. A bonus: the new setup allows convenient storage of many different network configurations (in the YAML file).
When the whole YAML file is read, it creates a PPSS object. That object has attributes corresponding to each bot: a predictoor_ss object (of class PredictoorSS), a trader_ss object (of class TraderSS), etc. It also holds network info in its web3_pp object (of class Web3PP).
4.3 New: Good data pipeline
We refined the pdr-backend architecture to have a proper data pipeline, in the new directory /lake/. It’s centered around a data lake with tiers from raw → refined. We’ve moved from storing raw price data as CSV files to parquet files, because parquet supports querying without needing a special database layer on top (!), among other benefits.
In conjunction, we’ve moved from Pandas dataframes to Polars dataframes, because Polars scales better and plays well with parquet. (We are already seeing intensive data management and expect our data needs to grow by several orders of magnitude.)
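
A small sketch of the raw-to-refined idea with Polars and parquet, using made-up column names and values: write raw candles once, then query the file lazily with no database layer in between.

```python
# Raw tier: persist candles as parquet. Refined tier: lazy query off the file.
import polars as pl

raw = pl.DataFrame({
    "timestamp": [1_700_000_000, 1_700_000_300, 1_700_000_600],
    "close":     [34_950.0, 35_010.5, 34_980.2],
})
raw.write_parquet("btc_usdt_5m.parquet")        # raw tier of the lake

refined = (
    pl.scan_parquet("btc_usdt_5m.parquet")      # lazy scan, no database needed
      .filter(pl.col("close") > 34_960)
      .collect()
)
print(refined)
```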
4.4 New: Space to grow analytics
We’ve also updated pdr-backend analytics support, in the new directory /analytics/. First, what used to be ad-hoc scripts for analytics tools now has proper CLI support: pdr get_predictoors_info, …, pdr check_network. These analytics tools now use data from the lake, and continue to evolve. Furthermore, we are building towards a more powerful analytics app that uses Python-style plotting in the browser, via streamlit.
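
For flavor, a tiny hypothetical streamlit page plotting one column from a lake file might look like the sketch below (run with `streamlit run analytics_app.py`); the real analytics app will differ.

```python
# Minimal sketch of in-browser plotting with streamlit; file and columns are
# illustrative, reusing the lake file from the previous sketch.
import pandas as pd
import streamlit as st

st.title("Predictoor analytics (sketch)")

df = pd.read_parquet("btc_usdt_5m.parquet")
st.line_chart(df, x="timestamp", y="close")     # plot close price over time
```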
5. How pdr-backend v0.2 Fixes v0.1 Issues
Here’s how v0.2 fixes each of the four issues raised above.
Issue: it was annoying to manually change code to change parameters.
v0.2 fix: use the YAML file & CLI for all parameters. The YAML file holds most parameters; the CLI specifies which YAML file and network, and sometimes commonly-updated parameters. The YAML file holds parameters that were previously envvars, or somewhere in code. (See the default ppss.yaml in the repo.)
Issue: envvars didn’t have enough fidelity.
v0.2 fix: use the YAML file & CLI for all parameters. In the YAML file, each bot gets its own subsection, including which feeds to work with. The YAML has far more fidelity because it also includes variables that were previously in code.
Issue: a messy CLI was organically emerging.
v0.2 fix: now we have a clean CLI. Previous calls to scripts for simulation, the predictoor bot, the trader bot, and various analytics are all now folded into the CLI. The CLI is implemented in the new directory /cli/; its core modules cli_arguments.py and cli_module.py use argparse, the best-practices CLI library for Python. The CLI has unit tests and system tests.
Issue: we wanted to extend pdr-backend functionality, and doing it in v0.1 code would explode complexity.
v0.2 fix: YAML & a clean CLI give a less-complex, more flexible foundation to build from. And we’re now nicely along in our progress: as we were building v0.2, we also refined its architecture to have a proper data pipeline (in /lake/), built the beginnings of a more powerful analytics app (in /analytics/), and are about to upgrade the simulation engine for more flexible and powerful experiments.
6. Conclusion
With Ocean Predictoor, you can run AI-powered prediction bots or trading bots on crypto price feeds to earn $. With pdr-backend v0.2, the interface to use predictoor bots & trader bots just got a lot simpler, via a CLI and a YAML file for parameters. It also refactors backend code such that we can do more powerful experiments around making $.
Get started at the pdr-backend repo: https://github.com/oceanprotocol/pdr-backend.
Notes
[1] The two 1.86M values being identical is a coincidence. Usually the values aren’t identical, though typically within 0.5x to 2x of each other.
About Ocean Protocol
Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.
Follow Ocean on https://twitter.com/oceanprotocol to keep up to date. Chat directly with the Ocean community on https://discord.gg/kwWmXxwBDY. Or, track Ocean progress directly on https://github.com/oceanprotocol.
Ocean Protocol will host an AMA on X on April 25

Ocean Protocol will host an AMA on X focusing on the future of token systems and the Token Engineering Academy training season. The event is scheduled for April 25 at 11:00 UTC.

Refer to the official OCEAN tweet

Ocean Protocol is a blockchain-focused platform aimed at democratizing data exchange and monetization. It allows users to sell and buy data without having to reveal the source, giving people control over their data and allowing them to sell it without compromising their privacy.
#OCEAN #oceanprotocol #news #ama
$OCEAN
Celebrating a Year of Ocean Protocol's Remarkable Achievements!

$OCEAN $ETH
Hey, Binance community!
As we bid farewell to another incredible year, let's take a moment to reflect on one of the products shipped out by Ocean Protocol in 2023. It's been a year filled with groundbreaking accomplishments, community-driven innovations, and the continuous evolution of decentralized data economies.
🚀 Ocean Uploader Revolutionizes Decentralized Storage: A standout achievement this year has been the launch and widespread adoption of Ocean Uploader. This sleek tool has streamlined the process of uploading files onto decentralized platforms, making it more accessible and efficient for users. With seamless integrations with platforms like Arweave, Ocean Uploader has truly changed the game.
I would love to hear about your experience using Ocean Uploader: https://uploader.oceanprotocol.com/

#OCEAN #oceanprotocol #etf #crypto2024catch

Exploring AI Coins Listed on Binance: Powering the Future of Blockchain and Artificial Intelligence

The convergence of artificial intelligence (AI) and blockchain technology has ushered in a new era of innovation in the cryptocurrency world. AI coins, often referred to as the backbone of this revolution, represent digital assets that incorporate AI capabilities within their ecosystems. These coins enable automation, enhanced decision-making, and innovative applications across multiple industries, from finance to supply chain management. Binance, one of the world’s largest cryptocurrency exchanges, hosts several AI-related coins that are shaping the future of decentralized technologies. In this article, we dive into some of the leading AI coins listed on Binance.
1. Fetch.ai (FET)
$FET Fetch.ai is at the forefront of merging AI with blockchain technology. This decentralized platform aims to create a self-sustaining economy of autonomous agents that interact to optimize services such as energy grids, transportation, and smart cities. By leveraging AI, these agents can make real-time decisions, communicate efficiently, and perform transactions without human intervention. FET, the native token of Fetch.ai, powers these interactions and allows developers to build AI-based decentralized applications (dApps) on the network.
Why Fetch.ai?
• Autonomous agents that revolutionize efficiency
• Strong use cases in real-world industries like mobility and logistics
• Growing developer ecosystem building AI applications
2. SingularityNET (AGIX)
SingularityNET is an ambitious project aimed at creating a decentralized marketplace for AI services. By using the AGIX token, users can access and monetize various AI solutions, including machine learning algorithms, data processing tools, and more. The platform envisions a future where developers and AI researchers collaborate seamlessly on a decentralized network, further democratizing access to AI.
Why SingularityNET?
• Decentralized AI marketplace promoting collaboration
• Empowering developers to share and monetize AI services
• Visionary leadership with Dr. Ben Goertzel, a key AI researcher
3. Vectorspace AI (VXV)
Vectorspace AI focuses on data analysis, providing AI-driven insights for various industries, including biotech, healthcare, and finance. Its unique approach uses Natural Language Processing (NLP) models to uncover relationships between data points, making it a valuable tool for researchers, analysts, and traders. VXV, the platform’s native token, facilitates transactions, rewards contributors, and grants access to its AI-powered datasets.
Why Vectorspace AI?
• Specialized in AI-driven data analysis for advanced research
• Offers high-quality datasets to uncover hidden patterns
• Strong applications in life sciences and investment sectors
4. Ocean Protocol (OCEAN)
Ocean Protocol is designed to unlock the value of data through decentralized data exchanges. By using blockchain and AI, Ocean allows users to share, monetize, and exchange data securely. The OCEAN token enables transactions on the platform, rewarding data providers while maintaining transparency and trust. AI developers can use the protocol to gain access to high-quality datasets for training machine learning models, ultimately advancing AI research and applications.
Why Ocean Protocol?
• Decentralized marketplace for secure data exchange
• Key use cases in AI model training and development
• Empowering data owners while promoting privacy and control
5. Phala Network (PHA)
$PHA Although not an AI coin at its core, Phala Network provides infrastructure crucial to AI operations, particularly in terms of privacy. Phala’s mission is to enable confidential smart contracts using trusted execution environments (TEEs), which ensures that sensitive data remains private during AI computations. This is vital for industries such as healthcare and finance, where confidentiality and AI processing intersect.
Why Phala Network?
• Focused on privacy-preserving computation
• Complementary to AI operations in sensitive industries
• Trusted by enterprises for secure AI data processing
The Impact of AI Coins on the Future
The role of AI coins in the broader blockchain ecosystem is transformative. By enhancing automation, decision-making, and real-time data processing, these tokens unlock new possibilities in industries ranging from autonomous transportation to healthcare. AI projects listed on Binance not only show promise in terms of technological advancement but also provide early adopters with investment opportunities in a rapidly evolving sector.
For investors, developers, and enthusiasts alike, keeping an eye on AI tokens such as Fetch.ai, SingularityNET, Vectorspace AI, Ocean Protocol, and Phala Network can provide insight into where the future of AI and blockchain is headed. As these platforms continue to evolve, they will likely introduce more AI-powered services, contributing to the decentralization of AI technologies globally.
$AI
Not financial advice. DYOR.
#fet.ai #agix #OceanProtocol #Phala #agixusdt
This AI Coin Is Getting Ready to Burn: Analysts Expect a Rally!

The AI coin project Fetch.ai plans to burn 5 million tokens on January 10, 2025. In doing so, it aims to reduce supply and increase demand. The recent excitement over AI coins has also fueled speculation about a potential rally for FET. Meanwhile, the $FET price has increased by 2 percent in the last 24 hours. Analysts are predicting a potential rally towards $3.
Fetch AI co-founder Humayun Sheikh announced the burning of 5 million FET tokens on January 10. The initiative aims to reduce the total supply of FET tokens, which is expected to increase their value by creating scarcity in the market. Token burns are generally welcomed by the crypto community, as these transactions usually lead to increased demand for the remaining tokens and higher prices. This move aims to actively manage the token supply, thus indicating a strong commitment to increasing the long-term value of its ecosystem.
In addition to the token burn, Fetch AI has completed its strategic merger with #OceanProtocol and #SingularityNET, thus creating the Artificial Superintelligence Alliance (ASI). This merger strengthens its position in the decentralized AI sector by offering a competitive alternative to the centralized control of technology giants over AI development. The merger also paves the way for a unified token for all three projects, the ASI token. According to experts, this is likely to further increase the price of the AI coin. The community also has high expectations for the price increase, making this a very important period for the project.
The Artificial Superintelligence Alliance (FET) has been steadily rising as anticipation builds around the FET token burn. At the time of writing, the FET price is trading at $1.47, up 2% in the last 24 hours. The token’s price has fluctuated between $1.44 and $1.50 over the same time period.
Ocean Protocol Hosts a Developer Dynamics Data Challenge on GitHub

Ocean Protocol is hosting a Data Dynamics Challenge on GitHub. The goal is to use GitHub data to analyze developer interactions and their impact on the project's crypto tokens. The competition will last from May 9 to May 28. The prize fund of the competition is $10,000.

Refer to the official OCEAN tweet

Ocean Protocol is a blockchain-focused platform aimed at democratizing data exchange and monetization. It allows users to sell and buy data without having to reveal the source, giving people control over their data and allowing them to sell it without compromising their privacy.
#OCEAN #OCEAN/USDT #oceanprotocol #news
$OCEAN