With our DFR solution, we aim to provide a mechanism that lets users rate a file after they have bought it. Later on we will integrate the DFR into our Oceancap solution, so users can additionally compare all existing data pools by file rating score.
Since the inception of Ocean v3 and Ocean Market, we have seen lots of data being published and data pools created. This has created lucrative staking opportunities for Ocean community members, but not everyone has benefited from them. Only the first and early stakers in any particular pool have been served well, at the cost of late joiners. Most people who made consistent profits were either technical or degens. Not all community members have the same technical skills.
There will be 3 deliverables at the end of this phase.
The Ocean Surfer #1 Duck Dive series aims to dive into the Ocean Protocol architecture and contribute technical implementations and integrations with existing tools/protocols, helping to increase the number of use cases within the Ocean community and unlock a new data economy.
The proposed application aims to provide a mobile platform for all things “data economy”. It combines a customizable interface similar to CoinGecko and Blockfolio, exclusively tracking datatokens and providing pricing, liquidity and trading-volume history for all data pools on the Ocean Marketplace. The application also allows users to seamlessly connect their Ethereum address and update their staking wallet balance.
The Beta App will be live on iOS and the Google Play Store within 3.5 months of a potential grant. We have already completed the full brief, followed by a full UI design. The back-end will be integrated once completed.
Ocean Academy is a community initiative started by Ocean Ambassadors. Its goal is to lower the intellectual barriers required to grasp Ocean Protocol’s mental models and technology.
Operation Plankton Bloom is intended as a first small step in the direction mentioned above. By seeking active contact with the very active open-data and open-science community in Germany and elsewhere in the world, it inquires about the values, favorable conditions and (legal) requirements that must be met for Ocean to be an attractive space for data providers, consumers and prosumers.
Significant value can be sourced to understand crypto markets, prices, developments, the regulatory landscape, use cases etc. by harvesting information from written text. Natural Language Processing (NLP) refers to the set of methods and analytical tools used to analyze unstructured text data, i.e. text that was created in free form and has a natural linguistic flow, rather than text created from templated and predefined rules.
Initially, 5 datasets will be published on the Ocean marketplace, one each for BTC, ETH, LTC, TRX and XRP (expanding in the future to include other projects with a high following). Each dataset will provide cleaned, pre-processed and featurized text data (as shown in the Value-Add Pipeline (VAP) in Figure 1) from every article, corresponding to hundreds of thousands of n-grams and millions of tokens, from various news sources, e.g. cryptodaily.co.uk, cryptoslate.com.
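The proposal does not publish its pipeline code, but the n-gram featurization step it describes can be sketched in a few lines. This is a minimal illustration, not ResilientML's actual VAP; the function names are hypothetical:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return all n-grams (as tuples) from a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def featurize(text, max_n=2):
    """Lowercase, tokenize on whitespace, and count uni- and bi-grams.
    Real pipelines would also clean punctuation, stopwords, etc."""
    tokens = text.lower().split()
    counts = Counter()
    for n in range(1, max_n + 1):
        counts.update(ngrams(tokens, n))
    return counts

features = featurize("Bitcoin rallies as Bitcoin funds see inflows")
# ('bitcoin',) is counted twice; ('bitcoin', 'rallies') once
```

Applied across thousands of articles, counts like these are what accumulate into the “100,000s of n-grams and millions of tokens” each dataset describes.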
The team at APY.vision is focused on bringing analytics to liquidity providers, a profession that did not even exist 12 months ago. We build a tool that tracks fees collected and impermanent loss (IL) from providing liquidity to a pool. We want to empower liquidity providers by giving them the tools to analyse the performance of their liquidity provision.
Because Ocean uses Balancer smart pools and a subgraph for them doesn’t exist yet, we will first need to create one or modify the existing one that we have. Afterwards we will add the calculation on the backend to support fetching the P&L for the token gains. Finally we need to look at fetching the prices for each datatoken to reflect the gains/losses in the UI. The project timeline is about 4 weeks from the start of work.
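The impermanent-loss part of that P&L calculation has a well-known closed form. The sketch below covers the simple 50/50 constant-product case for illustration; Ocean's datatoken pools are weighted Balancer pools, so the real backend would need the generalized weighted-pool formula:

```python
import math

def impermanent_loss(price_ratio):
    """IL for a 50/50 constant-product pool, relative to simply holding.

    price_ratio: current price of token A (in units of token B) divided by
    the price at deposit time. Returns a fraction <= 0.
    """
    return 2 * math.sqrt(price_ratio) / (1 + price_ratio) - 1

# A 4x price move costs about 20% versus holding the two tokens
loss = impermanent_loss(4.0)   # ~ -0.2
```

Fees earned are then added on top of this figure to get the LP's net P&L, which is what APY.vision surfaces in its UI.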
We are building a network of non-consensus nodes that simply gather data for other networks to interpret. The goal is to make it trivially easy and highly cost efficient to access a structured data representation of any webpage, and eventually, to verify the authenticity or validity of the contents therein.
App will be live on our website (above) (and available for download for Mac, PC, and Ubuntu)
Software will be open-source with a permissive license at: github.com/w1kke
You can read more about the current project status and Phase 1 features in our Medium article linked here.
The final product is developed with the help of Dr. Ali Masood, a PhD project lead and data scientist working with a team of React Native and full-stack developers. The Phase 1 application will be deployed with beta features on the Google Play Store and Apple App Store by Q2/2021, whereas our Phase 2 application is under continuous development with an estimated completion in Q4/2021.
Project Shrimp is on track for completion by February 8, 2021. Current status:
February 8, 2021
This is to provide an update on the group deliverables vs. what was promised with Project Shrimp.
Currently, the docs for core Ocean projects like Aquarius, Ocean.js, Ocean.py, Contracts and Provider are not kept up to date, which creates an issue for developers who work with these projects. A developer has to wait for the docs to be updated, or invest their own time going through the source code and the changes in a new release, to get their work done. This is a waste of valuable developer time.
rugpullindex.com, launched in November 2020 by Tim Daubenschütz, attempts to rank data sets by measuring their markets’ performance. We crawl all of Ocean Protocol’s markets daily and rank them by their liquidity and shareholder equality (Gini coefficient).
Our goal is to create an ERC20 token that allows an investor to gain exposure to the index’s top data sets. To do so, we’d like to follow the path of the DeFi Pulse Index. By receiving funding from the OceanDAO, we plan to implement the following milestones:
Round 2 project status: Funded. Content-creation integration and customization of the marketplace; onboarding data in video format to the Ocean marketplace.
Internally, we seek to build out our own community to best leverage the benefits of DAO: meaningfully engaging community members at every level of talent and commitment, incentivizing this participation, and tying the success of the DAO to the benefit of all participants in the ecosystem.
By the end of Q1, we will package our first dataset per the aggregation of data from our community pertaining to market sentiment. This dataset will be made available on Ocean Marketplace. With the success of our inaugural dataset, we plan to build out many more data sets targeting relevant topics, crypto markets or otherwise.
Evotegra and Data Brokers have proven to be a perfect fit with aligned interests. We are passionate about our business and convinced of the potential of Ocean Protocol. Together we will introduce Ocean Protocol and the Ocean Marketplace to other potential market participants with all the knowledge and experience we have gathered so far. We believe in the potential of Ocean being the connective tissue of Germany’s and Europe’s future privacy-preserving and GDPR-compliant data economy.
A general framework is needed for the economics of science, especially in the era of the data economy. Ocean can play a vital role here because it introduces market mechanisms to data services while, at the same time, building on FAIR data governance. FAIR means: Findable, Accessible, Interoperable, Reusable.
DataDAO’s mission is to allow the pooling of datatokens into a meaningful and valuable dataset whose value is greater than the sum of its parts alone, laying the foundation for a fairer and more inclusive distribution of value in any product or platform to the members who actually generate it.
MoonJelly is a Chromium browser extension expanded from the original Jellyfish browser extension that let users publish and search the Ocean commons (the original Jellyfish won first place in the Data Economy Challenge by Ocean Protocol back in 2019/2020). With the advent of datatokens and the Ocean Market, we recognized that this abandoned project desperately needed to be updated, and that is what we have done. MoonJelly is currently usable and available through the Chrome Web Store or thro
The field of neuroscience has generated an ocean of data, estimated to be in excess of an exabyte, collected from brain imaging studies of fish, rodents, primates, and humans. Most of this data remains siloed in individual labs - often in proprietary formats and inaccessible to the public. Recent movements in the neuroimaging community have embraced open science principles and a trickling stream of curated datasets has begun to flow onto public internet forums, making detailed measurements of th
A portion of the grant will be used to fund a corporate partnership with the INCF, giving Opscientia a seat at the table to promote the benefits of the data economy for Open Science via the Ocean Protocol.
ResilientML has developed methods in Python to produce these JSON-formatted text feature libraries that will form the core of our Semantic Reservoirs. These text-based data features are processed using specialized natural language processing (NLP) methods that ResilientML will bring to the Ocean community, based on extensive academic and industry experience in developing such solutions.
Initially, 5 datasets will be published on the Ocean marketplace, one each for BTC, ETH, LTC, TRX and XRP (expanding in the future to include other projects with a high following). Each dataset will provide cleaned, pre-processed and featurized text data (as shown in the Value-Add Pipeline (VAP) in Figure 1) from every article, corresponding to hundreds of thousands of n-grams and millions of tokens, from various news sources, e.g. cryptodaily.co.uk, cryptoslate.com. The fixed costs (man hours + compute
The Shanties project is a continuation of the work we started with Ocean over two years ago: surfacing data insights for sustainability. Our startup Peniel Impact has a mandate to accelerate data insights for sustainability. Our work covers all of the United Nations’ Sustainable Development Goals (SDGs) while also cutting across all corporate functions within the respective enterprises.
With this project, we hope to accelerate the translation of Ocean’s technology to our curated list of communities and partners to increase the volume and quality of ESG datasets.
The aim of the project is to build the next generation customizable open web3 social media platform that is powered by creators, supported by community and owned by everyone.
1st phase: Build foundation and strong core team. Create project roadmap. Conceptualize MVP.
This proposal is for funding for our Upload release. It will enable the community to contribute images + tags + descriptions to our image database. This will be rewarded with DataUnion.app Image vault tokens. We are using the funds of both proposals to reward our internal contributors for their work so far.
Following our successful OceanDAO participation in Rounds 1 and 2, we will expand the accessibility of our research with the help of alga., a free-to-use mobile application that is currently under development and for which our team has already written more than 5,000 lines of code (expected completion Q2/2021).
Our business aims to onboard at least 2 reputable corporate data partners in 2021. We aim to curate and launch both IDOs so that each achieves at least 500,000 in community-staked OCEAN liquidity.
We want to give data scientists and Ocean Protocol investors a tool for making informed decisions about data sets. Our thesis is that a data set’s market is a proxy for its quality. By providing high-resolution insights into the Ocean Marketplace, we can measure and improve the market’s overall health and performance.
Ocean Protocol has devised an amazing way to price datasets: Balancer-based data pools. But in the current setting, significant decision-making power rests with the data pool owner. All decisions for a given data pool (and its underlying datatoken) are taken solely by the pool owner (i.e. the data provider). This design creates a lot of issues and invites rug pulls.
We plan to release this dapp in beta by end of April 2021 or earlier.
I am active on Ocean Market and at one point had stakes in many data pools (and still do in a few). I always had to keep checking whether my stake had reached a certain price point for me to exit the pool. Currently there is no automated way to create a “limit order” (stake in or exit) for a stake in a pool. So I decided to build this myself, to allow me and other community members to add or remove their stake when their pool shares reach a certain level.
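The core of such a “limit order” is a watcher that polls the current value of a stake and fires an action once a threshold is crossed. A minimal sketch of that loop, with the valuation and transaction-submission callbacks left as assumptions (in practice they would query the Balancer pool contract and submit an exit transaction):

```python
import time

def watch_pool_share(get_share_value, threshold, on_trigger, poll_seconds=60):
    """Poll a stake's valuation and fire once it reaches a threshold.

    get_share_value: callable returning the current OCEAN value of the stake
                     (hypothetical; would read the pool contract on-chain).
    on_trigger:      callback that would submit the exit/entry transaction.
    """
    while True:
        value = get_share_value()
        if value >= threshold:
            on_trigger(value)
            return value
        time.sleep(poll_seconds)
```

The backend server in the deliverable below would run one such watcher per registered order, persisting orders in the database so they survive restarts.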
The final deliverable will be a web-based dapp + backend server + database to enable end-to-end operations.
We saw this RFP thread while looking for ideas to contribute and build something for the Ocean community, and found this idea there. With this project we aim to provide a monitoring platform for the Ocean community (and team) to watch and track OCEAN balances in all major wallets (whale wallets) holding OCEAN. Such a dashboard dapp will help the community keep track of OCEAN token movements over a given timeframe.
A dashboard dapp built using React.js, Node.js, web3.js, the Binance API, the Uniswap smart-contract API, the Sushiswap smart-contract API, the Etherscan API and the Bancor smart-contract API.
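Once per-wallet balances are fetched via those APIs (e.g. ERC-20 `balanceOf` calls through web3), the dashboard's core logic is ranking balance changes over the chosen timeframe. A language-agnostic sketch of that ranking step (illustrative only; the actual dapp uses web3.js/Node.js):

```python
def balance_deltas(snapshots):
    """Rank per-wallet OCEAN balance changes between two snapshots.

    snapshots: (before, after) dicts mapping wallet address -> balance.
    Returns (wallet, delta) pairs, largest absolute moves first.
    """
    before, after = snapshots
    deltas = {w: after.get(w, 0) - bal for w, bal in before.items()}
    for w in after:                     # wallets that appeared after the
        deltas.setdefault(w, after[w])  # first snapshot count in full
    return sorted(deltas.items(), key=lambda kv: abs(kv[1]), reverse=True)
```

Surfacing the top of this list per hour/day/week is exactly the “whale movement” view the dashboard aims to provide.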
So far, $OCEAN liquidity providers have not had the chance to vote on OceanDAO proposals, as the current Snapshot strategy used by Ocean Protocol is only “erc20-balance-of” (check the OceanDAO Snapshot space here).
Software will be open-source with a permissive license at: https://github.com/w1kke/snapshot.js. This will then be merged to the Snapshot repository via a pull request. The Ocean Protocol Snapshot Space will be updated with the new strategy and Alex N. will push this upstream.
Dounty is a data bounty marketplace where a Bounty Poster can post a bounty with a reward in OCEAN tokens. Bounty Workers can then fulfil the bounty by publishing their work on this bounty dapp built on Ocean Protocol and asking the Bounty Poster to purchase it. Once the Poster consumes a worker’s work, they pay the pre-agreed bounty reward in OCEAN.
This phase is planned to be built and complete by the end of April 2021.
With Better Stats we want to build a fundamental-analysis tool, or screener, for data pools in Ocean Protocol. This tool will help and educate the Ocean community to check and track certain metrics of a particular data pool. Some of the key metrics we plan to track include (but are not limited to):
A dapp built using HTML, CSS, web3.js, Node.js, MongoDB, Balancer ABIs, ocean.js, Chart.js and other frameworks. The dapp will be accessible via the web.
Building on our comprehensive education platform designed to act as a catalyst for the adoption and growth of Ocean Protocol, project Kraken delivers a new educational module on Compute-to-Data (C2D). It aims to lower the barriers for data owners and data service providers to monetize private data while ensuring data ownership, control and GDPR compliance. With project Kraken we pursue the goal to explain and promote C2D, Ocean Protocol’s foremost advantage and a prerequisite for large-scale ent
This is to provide an update on the group deliverables vs. what was promised with Project Kraken.
Evotegra and Data Brokers have proven to be a perfect fit with aligned interests. We are passionate about our business and convinced of the potential of Ocean Protocol. Together we will introduce Ocean Protocol and the Ocean Marketplace to other potential market participants with all the knowledge and experience we have gathered so far. Furthermore, we will assist publishers and consumers in market exploration. We believe in the potential of Ocean Protocol being the connective tissue of Germa
The field of neuroscience has generated an ocean of data, estimated to be in excess of an exabyte, collected from brain imaging studies of fish, rodents, primates, and humans. Most of this data remains siloed in individual labs - often in proprietary formats and inaccessible to the public. Recent movements in the neuroimaging community have embraced open science principles and a trickling stream of curated datasets has begun to flow onto public internet forums, making detailed measurements of th
The admissions-data niche alone is estimated to be worth around one hundred million dollars annually, with many analysts expecting this number to rise in the coming years. This is a perfect market to target with an outreach effort because we are introducing Ocean Protocol to universities and young college students. High school students would publish their grades, test scores, age, demographic information, race, ethnicity, and anything else they want. These data can then be collected into larger d
Q2 2021 - Create Digital Assets such as whiteboard video and a 5 second YouTube ad video.
The Currents Project is all about making the data on the Ocean Market the best data available anywhere. Our initial goal is to create an NLP-based sentiment dataset that covers the top 150 cryptocurrencies. In plain English: the dataset will be a valuable resource for understanding both the shifting attention and attitudes of the crypto community. Most importantly, this data will be cleaned and standardized to be easily readable by code or machine-learning algorithms to enable others to bui
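As an illustration of what “readable by code” means for a sentiment dataset, the simplest possible scorer is a lexicon lookup. The Currents Project's actual NLP models are unpublished and certainly more sophisticated; the word lists below are invented for the example:

```python
# Hypothetical toy lexicons -- a real model would be far larger
POSITIVE = {"rally", "surge", "bullish", "gain"}
NEGATIVE = {"crash", "dump", "bearish", "loss"}

def sentiment_score(text):
    """Crude lexicon score in [-1, 1]: (pos - neg) / matched words."""
    tokens = text.lower().split()
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched
```

Scores like this, computed per coin per day and stored in a standardized schema, are what let downstream code and ML pipelines consume the dataset without any cleaning of their own.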
First Milestone:
This project aims to empower and encourage OCEAN holders to participate in OceanDAO voting without withdrawing or shifting their OCEAN holdings from other platforms like exchanges and marketplaces.
Snapshot strategies for Uniswap, Balancer, Sushiswap and Ocean Market integrated in Snapshot-spaces repo.
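Actual Snapshot strategies are written in JavaScript in the snapshot.js repo; the sketch below only illustrates, in pseudocode-style Python, the aggregation each strategy performs: a voter's power is their wallet OCEAN plus the OCEAN share of each LP position. The data shapes are assumptions for the example:

```python
def voting_power(balances):
    """Sum a voter's OCEAN across venues.

    Wallet balance plus, for each LP position, the holder's share of the
    pool's OCEAN reserve: lp_held / lp_total * ocean_reserve.
    """
    power = balances.get("wallet", 0.0)
    for pos in balances.get("lp_positions", []):
        power += pos["lp_held"] / pos["lp_total"] * pos["ocean_reserve"]
    return power

voting_power({
    "wallet": 100.0,
    "lp_positions": [{"lp_held": 5.0, "lp_total": 100.0, "ocean_reserve": 2000.0}],
})  # 100 direct + 5% of a 2000-OCEAN reserve = 200
```

Each venue (Uniswap, Balancer, Sushiswap, Ocean Market) supplies one such position via its own strategy, and Snapshot sums the strategies' results per voter.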
Evaluate the regulatory requirements for operating decentralized data exchanges in Germany (and Europe). Data exchanges with financial transactions and the emission of data-backed assets will most certainly fall under the scope of the German financial supervisory authority (Bundesanstalt für Finanzdienstleistungsaufsicht, BaFin).
As a result of the DAO round, we will get in touch with BaFin and, if necessary, other authorities to validate and document the legal requirements for operating a decentralized data exchange within the EU.
Evotegra GmbH provided the object-detection network and high-performance inference engine used during the AI premiere in the German live television show “Bundespolizei Live”, broadcast by Kabel1 on 18/09/2019. During the 2-hour show our network processed 50 Full-HD television pictures per second on a single consumer-grade GPU with low latency, reliably detecting and obscuring all faces in the image. This enabled us to meet the particularly high demands on the protection of individual privacy in
Initially, 5 datasets will be published on the Ocean marketplace, one each for BTC, ETH, LTC, TRX and XRP (expanding in the future to include other projects with a high following). Each dataset will provide cleaned, pre-processed and featurized text data (as shown in the Value-Add Pipeline (VAP) in Figure 1) from every article, corresponding to hundreds of thousands of n-grams and millions of tokens, from various news sources, e.g. cryptodaily.co.uk, cryptoslate.com.
Labelling data on a smartphone is not just inefficient; it also introduces a human perception bias into the data, preventing AI from leveraging its full potential. And while datasets are continuously getting more complex, any additional complexity deteriorates the quality achievable by humans. To solve this issue we introduce a revolutionary approach allowing humans to scale 100x while significantly improving the quality, and hence the value, of the data.
This proposal is for funding of our mobile app. It will enable the community to source and verify images + tags + descriptions in our image data vault using their mobile phones. This will be rewarded with DataUnion.app Image vault tokens. We are using the funds of the proposals to reward our internal contributors for their work.
The next milestone is opening the upload part of the platform to a wider audience, planned for March/April. After that, the validation part of the platform will be opened.
Data exchanges with financial transactions and the emission of data-backed assets are likely to fall under the scope of the financial supervisory authorities as “fractional securities”. The ultimate goal of this proposal is to establish a distinct legal framework for how to operate data markets. Good news: some prerequisites have already been fulfilled (see below).
As a first step we will get in touch with BaFin and, if necessary, other authorities to validate and document the legal requirements for operating a decentralized data exchange within the EU.
Oort Digital is a DeFi aggregator for NFTs, a super account that allows people to do anything with their NFTs without a bank.
The Oort project will bring NFT data to Ocean. We will first curate transaction data from NFT leasing, collateralized loans and fractionalized trading and bring it to Ocean. This initial data input will help build an NFT valuation mechanism. With more effective NFT valuation, more DeFi applications can be built, which will bring more data to Ocean and further enrich the whole NFT ecosystem.
An estimated thousands of petabytes of data on human health, economic activity, social dynamics, and scientific observations of the universe and our impact on it are siloed in legacy institutional web infrastructure.
This grant requests funds to support:
Large transformer models have major commercial applications in audio, video, and text-based AI. Due to the high cost of training and inference, it is not possible for most developers to utilise their own models, so they rely on centralised API access, which can be revoked at any time and comes with price and privacy concerns.
ResilientML has developed methods in Python to produce these JSON-formatted text feature libraries that will form the core of our Semantic Reservoirs. These text-based data features are processed using specialized natural language processing (NLP) methods that ResilientML will bring to the Ocean community, based on extensive academic and industry experience in developing such solutions.
Datasets will be published on the Ocean marketplace, focusing on rapidly emerging projects including Web3.0: Ocean, Streamr, Fetch AI; DeFi: Solana, Serum, Polkadot; Metaverse: Decentraland, Enjin, Red Fox, Axie Infinity etc. (expanding in the future to include other projects with a high following). Each dataset will provide cleaned, pre-processed and featurized text data (as shown in the Value-Add Pipeline (VAP) in Figure 1) from every article, corresponding to hundreds of thousands of n-grams and millions
Qualifier: we understood these proposals at the time as a bigger structure with a long-term engagement horizon, in multiple stages. To show the community our long-term roadmap, we outlined a 5-month horizon with deliverables in stages. Thus one funded grant addresses a portion of this roadmap.
The field of neuroscience has generated an ocean of data, estimated to be in excess of an exabyte, collected from brain imaging studies of fish, rodents, primates, and humans. Most of this data remains siloed in individual labs - often in proprietary formats and inaccessible to the public. Recent movements in the neuroimaging community have embraced open science principles and a trickling stream of curated datasets has begun to flow onto public internet forums, making detailed measurements of th
This grant requests funds to support:
rugpullindex.com helps data scientists and investors to make better decisions when buying data online. Our thesis is that markets are proxies for assets’ qualities.
Our ultimate goal is to develop a fully GDPR and regulatory compliant open source reference data market to enable adoption of data markets at scale by European enterprises and institutions.
A first iteration is planned to be released in May. Afterwards we intend to implement regulatory requirements in order to kickoff an iterative development process and feedback loop with all involved parties.
Our current dashboard is a single-page dapp (as shown in the attached screenshot) that provides important metrics and an overview of $OCEAN liquidity across various exchanges over 3 different time frames: 1 hour, 1 day and 1 week. After talking to a few community members and traders, we found that such information, though good enough to track the current trend of the OCEAN token, is not necessarily enough to convince medium-term investors to take investment decisions based on the insigh
Dashboard dapp (https://oceandashboard.com) functionality will be extended to add more focused detailed view for each exchange to track historical liquidity details.
SecondLook is a data-as-a-service platform that allows users to generate realistic and privacy-safe synthetic data from sensitive personal data.
The Currents Project is all about putting datasets on the Ocean Market that are the best available anywhere . Our initial goal is to create an NLP (NLP = Machine Learning on Language) based sentiment dataset that contains all 150 of the top cryptocurrencies. In plain english; the dataset will be a valuable resource to understand both the shifting attention and attitudes of the crypto community. Most importantly, this data will be cleaned and standardized to be easily readable by code or machine
The field of neuroscience has generated an ocean of data, estimated to be in excess of an exabyte, collected from brain imaging studies of fish, rodents, primates, and humans. Most of this data remains siloed in individual labs - often in proprietary formats and inaccessible to the public. Recent movements in the neuroimaging community have embraced open science principles and a trickling stream of curated datasets has begun to flow onto public internet forums, making detailed measurements of th
This grant supports the following deliverables:
This tool aims to become an essential part of any data publisher’s or token investor’s toolkit. Putting organized on-chain analytics in the hands of Ocean Protocol users will allow them to make better decisions. The added transparency will help build trust in the Ocean ecosystem. Easily searchable data will help new Ocean users reach a deeper understanding of the protocol more quickly. All of this leads to greater use, more impactful use, and more investment in the Ocean ecosystem.
Strict privacy laws and consumers’ increasing awareness of how companies use their personal data make data sharing a long, costly and risky process, due to fear of breaches, fines and loss of customers’ trust.
This proposal is for funding of our annotation feature. This feature is very heavy and might not be completed in one month. It will enable the community to annotate images in our webapp. We are using the funds of the proposals to reward our internal contributors for their work.
While building towards a decentralized data economy powered by Ocean Protocol, we recognized the need for a use case library, dedicated to business/data owners, decision-makers, policymakers and regulators. “Ocean Use Cases” aims to find and document Ocean Protocol use cases with regard to real-world examples and collect them in an open-source deck to be used for outreach, awareness, lobbying and early onboarding.
Outreach/growth:
Posthuman is a marketplace based on Ocean Protocol that allows users to buy compute services on large NLP models. Model Providers contribute funds to train useful models, and Model Consumers purchase inference and evaluation on the models they find most useful. With Posthuman v0.2, users can now train, infer, and evaluate on any arbitrary text data.
Our solution could find use cases in many segments where content is generated rapidly, modified, and distributed in common media formats. This can be utilized in micro-learning, assisted training, video documentation, and e-learning. The project is a unique mix of technology that enables others to come and build solutions on top of it.
Progress after previous Grant:
Round 5: Not Granted - Optimization of the user journey to sign the transactions involved in the OCEAN marketplace listing. Launching the Classroom, and onboarding Class records to Market.
rugpullindex.com helps data scientists and investors to make better decisions when buying data online. Our thesis is that markets are proxies for assets’ qualities.
Currently, it’s unclear which businesses have the most to gain from consuming datasets from Ocean, and we believe the Ocean community is still not clear on why data buyers would choose Ocean Protocol for their data purchases over existing providers.
This project has been completed. Findings from our research can be found here:
This proposal is to continue to extend, develop and further curate our initial development of a Crypto specific Natural Language Processing Data Suite.
Month 1:
In this round, we focused on automating the steps using the robust cloud infrastructure identified in round 4, as well as working on academic papers utilising our methodology. Working on academic papers is important because they are subject to strict peer review, which gives confidence in the way we process our datasets and proves and increases their utility and market value.
The outcome of this collaboration between ResilientML and the OceanDAO community is multi-fold:
Together with course notes explaining the methodologies and concepts, we will provide R and Python code with examples of applications. All scripts and examples as well as course notes will be provided in advance of the course.
Description of the project: Create a marketplace of content creators using datatokens. The current perceived usefulness of the information provided by any given creator will be determined by the price of their datatoken. It’s no longer a secret how important data is to professional sports. Every single team and front office uses it to make decisions about their personnel and now as fans turn into GMs through daily fantasy - they rely on it just as heavily.
If we are selected for the grant, it will be used strictly for software development and implementation of Ocean Protocol into our current MVP. We are also looking for outside funding, so the plan would be to use those new funds for the full development of the app and the wider market economy. We plan to hit the ground running, so if we do receive the grant we want users to be able to start using datatokens on roto.life by June 1.
Currently the EVO/2MP/TRFC/DE/200K dataset contains 192,499 images with 807,415 individual labels in 274 classes. With this proposal we are going to extend the dataset with at least the pedestrian and car classes, as well as bicyclist, motorcycle and bus stop if there are sufficient samples in the data. This will extend the dataset to over a million unique labels and significantly improve its value. In our highly automated process all labels are created and verified multiple times by an AI superv
We are going to update the EVO/2MP/TRFC/DE/200K dataset with the classes:
We delivered 6 of 6 classes, switching one class from bus to train/tram/metro, with twice as many annotations as promised.
Ocean Pearl is the first product that was initiated by the Ocean Tech Ship. The Ocean Tech Ship focuses on identifying problems that block the growth of the Ocean ecosystem.
Milestone 1: MVP release:
The primary focus of this initiative is to stand up and integrate the Ocean Marketplace into our Vantage Signals platform, eventually enabling any user to consume any numerical data from the Ocean Protocol network and begin performing analysis on it.
Large transformer models have major commercial applications in audio, video, and text based AI. Due to the high cost of training and inference, it is not possible for most developers to utilise their own models, and thus they rely on centralised API access- which can be revoked at any time and comes with price and privacy concerns.
The central goals for this grant, deliverable over the next 30-45 days, are:
Outreach
Buidl Ocean is an online community-based initiative to:
This grant supports the following deliverables:
rugpullindex.com helps data scientists and investors to make better decisions when buying data online. Our thesis is that markets are proxies for assets’ qualities.
Spearheading the charge into streaming-data support on the Ocean Network, we will build an open-source “Ocean Vantage - Data Bridge” that extends the Ocean network’s streaming capability with webhook and API support, which we need to cater to this industry. We will then use this framework to build a signal syndication network that enables hundreds or thousands of quantitative analytics platforms to publish their signals to the marketplace and syndicate them across all the Ocean Marketplaces, connecting dire
Phase 1 & Phase 2: Stand up data gateway and marketplace fork
The Sports Analytics market is expected to reach USD 5.11 Billion annually by 2026, the Sports Medicine market is estimated at over USD 5 billion annually and the global market for athlete tracking systems was USD 2.26 billion in 2018.
Currently the EVO/2MP/TRFC/DE/200K dataset contains 192,499 images with 807,415 individual labels in 280 classes. With this proposal we are going to extend the dataset by at least the following road markings:
A video streaming/casting solution where the OCEAN Protocol is used for ticketing: webRTC access will be stored on-chain using the OCEAN Protocol. Anyone interested in consuming a live stream on the VideoWiki platform, or in connecting to our webRTC to stream on their own platform, will have to buy access using OCEAN tokens.
Round 6: Researched various streaming options to onboard to our marketplace, built UI around the casting flow, and conducted successful pilot events.
The value of an individual consumer’s data footprint is rising exponentially as the amount of data an individual creates every day increases and as privacy legislation prevents tech giants from extracting this valuable data. Solipay has developed a software platform that lets consumers easily share their entire data footprint and monetize that data on select, high-quality data marketplaces. To date, over forty thousand people from over one hundred eighty countries have shared over fifty million
A new listing on the Ocean datamarket that uniquely allows individual consumers to download Solipay, opt in to share data, then get paid in OCEAN for sharing their data with our synergistic platform integration.
Data Whale’s efforts towards encouraging adoption of a new data economy can be segregated into three main pillars.
Alternate Future Summit is an initiative designed to encourage developers, start-ups, technologists, and corporate management to unite and develop solutions that will help large-scale recovery and define the future.
The Ocean Ambassador Program was created to provide meaningful opportunities for community members to actively support Ocean Protocol long term goals.
In 2020 Facebook, Amazon and Google made $245.7 Billion in ad revenue while consumers received $0.
Q2 2021:
Since the last DAO round, we’ve made major upgrades to our codebase, focusing on developing commercially useful variants of the models (DistilBERT and DistilGPT2) and algorithms that we shared earlier (with v1) - to accrue value for Ocean holders and to enable corporate AI use-cases directly from Ocean Market.
This is the final app humanity needs - it gives us the ability to contribute to AI and robotics that will take over the world step by step. We want to give everyone the ability to use their data for a better future and their own profit.
The field of neuroscience has generated an ocean of data, estimated to be in excess of an exabyte, collected from brain imaging studies of fish, rodents, primates, and humans. Most of this data remains siloed in individual labs - often in proprietary formats and inaccessible to the public. Recent movements in the neuroimaging community have embraced open science principles and a trickling stream of curated datasets has begun to flow onto public internet forums, making detailed measurements of th
Longtail Financial is a squad of developers and data scientists in web3. Our team would like to answer the call to issues on tokenspice2 on GitHub: https://github.com/oceanprotocol/tokenspice2/issues
The app will be live at: https://github.com/oceanprotocol/tokenspice2
• Nov 2020 - Jan 2021: Initial platform development
· Conference Workshops & Seminars
● Jul.-Aug: Detail project plan with partners (e.g. toilet owners, nano community) and kick-off
We have started to work on:
Implement “allowNetworkAccess” and provide the open-source code in a public GitHub repository. In addition, provide clear documentation on how to build access control mechanisms that limit egress network traffic. We will provide developer documentation on how to build a scalable and trusted Kubernetes implementation. This code will also be open source, available for anyone to use, and hosted on any web URL that the community decides on.
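As an illustration of the kind of egress control this documentation will cover, a standard Kubernetes NetworkPolicy can deny all outbound traffic from compute pods except DNS. This is only a minimal sketch; the namespace and pod label below are hypothetical placeholders, not names from our implementation.

```yaml
# Hypothetical sketch: block all egress from algorithm pods in an
# "ocean-compute" namespace, allowing only DNS lookups (port 53).
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: restrict-egress
  namespace: ocean-compute     # hypothetical namespace
spec:
  podSelector:
    matchLabels:
      app: compute-job         # hypothetical label for algorithm pods
  policyTypes:
    - Egress
  egress:
    # No "to" selector: any destination is allowed, but only on the
    # listed ports, so everything except DNS is dropped.
    - ports:
        - protocol: UDP
          port: 53
        - protocol: TCP
          port: 53
```

Note that NetworkPolicy enforcement depends on the cluster’s network plugin; an “allowNetworkAccess” flag could then be implemented by simply not attaching such a policy to a given job’s pods.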
Milestone 2: Enhance current functionality
Activities scheduled for July 2021:
We will use the grant funds to build a JavaScript library that incorporates the following features:
The central goals for this grant, deliverable over the next 30-45 days, are:
Integration of the OCEAN Protocol as part of RAZ Finance platform business development and investor presentations. Channelling of raised investment capital into the integration of RAZ as a SaaS platform, with OCEAN as a datatoken and DEX platform.
Project updates:
1) Clinical review of publications about blockchain in the medical literature to identify existing public health applications and untapped opportunities for health data-driven applications, based on the key network design principles extrapolated from OCEAN values.
Update 1:
[ ] Concept for the design and functionality of our mobile app.
Our image collection, annotation, and publishing process follows a waterfall-like structure.
Grant Deliverable 1
Our first dataset has been published to the Ocean Market.
In this round we will expand our offering to include:
Grant Deliverable 1:
[X] Professional and scientific datasets collected over a multi-year time frame from Walkers Reserve and other environmental projects, covering carbon dioxide levels, turtle migrations and wildlife surveys. These will be sourced from long-standing conservation institutions in Barbados and the surrounding areas, and made available at high prices on the Ocean Marketplace.
Grant Deliverables:
Create an Ocean Protocol newsletter/blog for the Greek community that will include the most relevant Ocean Protocol blog posts translated into Greek, plus the podcasts (included in the blog posts) in video format with subtitles (English subtitles will be created as well). Also, create social media channels in order to build the community. Hosted at https://www.oceanprotocol.gr
Team Opsci is happy to report that we have met all of our Round 9 deliverables and that work continues on our roadmap for Coral, our Open Science Marketplace dApp.
Milestone 3: Voting Leaderboard & UX improvements
Grant Deliverable 1
1) Continue to extend, develop and further curate our initial development of a crypto-specific Natural Language Processing Data Suite – specifically, continuing to expand the three datasets TASPEL-27, PASCOR-89 and INVPEN-41.
[ ] To finalize the following features for ALGA: Detailed Data Token Analysis for all Ethereum Pool Assets, ALGA wallet functionality [allowing for the first mobile Data Token transaction via WalletConnect on Ethereum!], Profile Creation, Integration of RugPull-Index Analysis [BUILD]
Grant Deliverable 1
[ ] Mobile App V2 & gamification concept
Milestone 1 / End of October 2021
We’ve [publicly stated](https://rugpullindex.com/blog#WereNOTAStartup) that we prefer retroactive funding. Hence, please consider this proposal an **invoice** and the above-enumerated deliverables as work that has been done and now needs to be paid.
Product & Services
Product & Services
10/2021:
Build / improve applications or integrations to Ocean
[ ] Train two sizes of PH-Codex - PH-Codex-M and PH-Codex-J - based on the spec above.
Using this grant, we will build the following features and artifacts:
We are working on structuring a bigger roadmap, where we plug these data inputs through the VideoWiki platform into the Ocean Marketplace for educational sale, purchase and subscriptions to the knowledge content.
[ ] Major Milestone! To integrate WalletConnect (Ethereum) into the current ALGA UI/UX
[ ] Train PH-Codex-J based on the spec defined above. Share test loss, writing samples and other results.
We have decided to fund round 11 ourselves and apply for a grant of $1, a symbolic amount, to remain part of the ecosystem and see whether the Ocean community values our Compute-to-SSI proposition.
Grant Deliverable 1
Grant Deliverable 1
Objective 1 - Community Onboarding
[ ] Gamification concept mobile app implementation sprint one
[ ] Major Milestone! To conduct the first Ocean Market transaction on a mobile application, to be reviewed on Etherscan.
[ ] upload example dataset to Filecoin Plus
[ ] Obtain feedback on our draft ‘lite’ paper from 15-20 people (including outreach and engagement with BCI, bio-hacker, neuroethics and Web3 communities)
[ ] 3+ Data Applications from content creators (2 are WIP, 1 internal)
1) 3 development sprints to code a patient-direct health data aggregator of chronic disease self-monitoring healthcare data from patients with verified diagnoses of diabetes, hypertension, and/or metabolic syndrome/obesity. -> Estimated cost 75%
The full spec for this proposal is available at the document below:
Hold a grand launch event to introduce Ocean Protocol in partnership with the Blockchain Club of Uganda. Demonstrate how Ocean data markets operate. Create an Ocean East Africa Telegram channel. We shall provide translations into Kiswahili (a language widely spoken across East Africa). Fund the Makerere University Blockchain Club with transport to attend our launch event, with the intention of enrolling them in the Ocean Academy. Quantitative and qualitative growth in the Ocean community in Uganda and East Africa.
The final product is a service that collects, stores and displays sellers’ listings outside of the MLS. This valuable data can then be accessed directly by wholesalers or investors (who place a premium on listings outside of the MLS), and access can be sold to sites such as Zillow and Realtor, which will add more value to their platforms.
As I imagine it, this is a big project. I expect this first period will be spent mostly on researching and trying to aggregate all the data I need from the core components of Ocean. Once I have a stable backend (source of data), I will start planning more front-end use cases.
Objective 1 - Community Onboarding
This may be a bigger endeavor than I can handle, but I will take it one step at a time.
Achievements/Deliverables
Please reference the document below for the detailed description, with proper links to the delivery checklist.
I have now completed a wide range of activities for this grant proposal.
Growth of the Japanese community (quantitatively measured by the number of followers and PV of the content)
ALGA
Grant Deliverables:
Our primary focus has been to tell people about the unique structure that Ocean Market has added to the data economy in the past month. In addition, as you have seen, we have presented the translations of articles and original content to our community to explain the foundations of the ecosystem. This month, we also aim to tell people about OceanDAO in all its aspects and to raise awareness to bring new projects from the Turkish community to Ocean Protocol. With this purpose in mind:
Last month we shifted our focus to Activation for those within the Ocean Missions ecosystem. This month the focus within the Ocean Missions ecosystem remains the same; however, we will also start to implement some Acquisition activities for Web3 data buyers and suppliers through the recently released Web3Access.org.
The first phase is to provide tools for dataset publishers to better understand positioning for their dataset, which will be the main focus of the current grant.
Build a web page to facilitate the onboarding process for prospective content merchants and prospective buyers/users.
Round 13 - Requested $9.5k. $3k deployed from carry-over. $6k deployed. $3k + R8 publish carry-over to R14.
January (R13)
NOTE: Since this round, we’re not retroactively funded anymore. Here’s what we want to do in February 2022:
ALGA
[Deliverables Checklist]
Objective 1: Community on-boarding
Objective 1 - Community Onboarding
In our previous proposal we mapped the OceanDAO data available, conducted research, defined KPIs, and specified the analytics framework. For this proposal our deliverables are:
Commercial
[Deliverable Checklist]
Using this grant, we will build the following features and artifacts:
Deliverables for Round 14
[Unleash Data]
[Unleash Data]
Reorganization
Community on-boarding
The initial stage of building Knowan, for which we are requesting funds in this proposal, is focused on community outreach and engagement to better involve and understand stakeholders from numerous spaces in the creation of a platform that truly addresses the needs of its users. Grant deliverables would be as follows:
ALGA
Deliverables from our last proposal (what we achieved last month) :
[ ] 3 Coordinape Epochs budgets with the fixed value of $1000/round.
Project Creator:
Commercial
Using this grant, we will build the following features and artifacts:
We use OKRs to identify and prioritise grant deliverables:
[ Unleash Data ]