[DataCap Application] BitsAndBytes - Hubble Space Telescope data #17

Closed
1 of 2 tasks
bitsandbytes03 opened this issue Aug 27, 2024 · 11 comments

@bitsandbytes03

Version

1

DataCap Applicant

BitsAndBytes

Project ID

1

Data Owner Name

Hubble telescope full dataset

Data Owner Country/Region

United States

Data Owner Industry

Life Science / Healthcare

Website

https://archive.stsci.edu/missions-and-data/hst

Social Media Handle

archive@stsci.edu

Social Media Type

Other

What is your role related to the dataset

Data Preparer

Total amount of DataCap being requested

10PiB

Expected size of single dataset (one copy)

1PiB

Number of replicas to store

10

Weekly allocation of DataCap requested

500TiB
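
For a quick consistency check, the total request equals one 1 PiB copy times 10 replicas, and at the requested 500 TiB/week the full 10 PiB would take roughly 20 weeks to onboard. A minimal sketch of that arithmetic (the onboarding-time figure is a derived estimate, not a number stated in the application):

```python
# Sanity check on the requested figures. The sizes, replica count and weekly
# rate come from the application above; the ~20-week fill time is derived.
single_copy_tib = 1 * 1024      # 1 PiB expressed in TiB
replicas = 10
weekly_rate_tib = 500           # requested weekly allocation in TiB

total_tib = single_copy_tib * replicas          # 10,240 TiB = 10 PiB total
weeks_to_onboard = total_tib / weekly_rate_tib  # ~20.5 weeks at the full rate

print(f"Total DataCap: {total_tib} TiB ({total_tib / 1024:.0f} PiB)")
print(f"Estimated onboarding time at 500 TiB/week: {weeks_to_onboard:.1f} weeks")
```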

On-chain address for first allocation

f1paij7fdmzcgngc7v3mdoxuh6jvm4u5cl4uy7soi

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

  • Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

Bits&Bytes is a European organization that aims to make a significant impact on the decentralized storage space in the near future. We have access to our own computer rooms in Amsterdam and in Belgium, capable of processing large volumes of data. Our team is well recognized in the Benelux internet scene and in other industries.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

We will store the Human Pangenome Reference Consortium (HPRC) open dataset from AWS. This is a project funded by the National Human Genome Research Institute to sequence and assemble genomes from individuals from diverse populations, in order to better represent the genomic landscape of those populations.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

If you are a data preparer, what is your location (Country/Region)

Netherlands

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details.

We will use Singularity.
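
Singularity packs source data into CAR files for dealmaking. As a rough, back-of-the-envelope view of the preparation workload, assuming 32 GiB sectors and Filecoin's 127/128 Fr32 padding ratio (neither assumption is stated in the application), one 1 PiB copy works out to roughly 33,000 CAR-sized pieces:

```python
# Hypothetical estimate of piece counts for one copy of the dataset.
# Assumes 32 GiB sectors and Fr32 padding (127 payload bytes per 128 sector
# bytes); neither assumption comes from the application itself.
GiB = 1024**3
PiB = 1024**5

sector_size = 32 * GiB
max_payload_per_sector = sector_size * 127 // 128   # ~31.75 GiB of raw data

dataset_size = 1 * PiB                               # one copy, per the application
pieces_per_copy = -(-dataset_size // max_payload_per_sector)  # ceiling division

print(f"Max payload per 32 GiB sector: {max_payload_per_sector / GiB:.2f} GiB")
print(f"Approx. pieces per 1 PiB copy: {pieces_per_copy}")
```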

If you are not preparing the data, who will prepare the data? (Provide name and business)

No response

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

The full set has not been stored completely on Filecoin and is not accessible in full. We aim to store it in full and have it retrievable.

Please share a sample of the data

This will be the AWS bucket.

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Monthly

For how long do you plan to keep this dataset stored on Filecoin

More than 3 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, Europe, Australia (continent)

How will you be distributing your data to storage providers

HTTP or FTP server
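
For illustration only, a minimal sketch of serving a directory of prepared CAR files over HTTP with Python's standard-library server, so storage providers can pull the pieces; the directory path and port are placeholders, not details from the application:

```python
# Minimal sketch: expose a directory of prepared CAR files over HTTP so that
# storage providers can fetch them. Path and port are illustrative placeholders.
import http.server
import socketserver

CAR_DIR = "/data/prepared-cars"   # hypothetical output directory from data prep
PORT = 8080

class CarHandler(http.server.SimpleHTTPRequestHandler):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory=CAR_DIR, **kwargs)

with socketserver.TCPServer(("", PORT), CarHandler) as httpd:
    print(f"Serving CAR files from {CAR_DIR} on port {PORT}")
    httpd.serve_forever()
```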

How did you find your storage providers

Partners

If you answered "Others" in the previous question, what is the tool or platform you used

No response

Please list the provider IDs and location of the storage providers you will be working with.

Holon Australia, Melbourne: 1 replica
DSS Australis, Sydney: 1 replica
Dcent, The Netherlands: 2 replicas
This dataset is complementary to the OpenPanda project. If SPs are interested in storing it, please contact us. We have 6 slots available.

How do you plan to make deals to your storage providers

Boost client

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes


datacap-bot bot commented Aug 27, 2024

Application is waiting for allocator review


datacap-bot bot commented Aug 27, 2024

Datacap Request Trigger

Total DataCap requested

10PiB

Expected weekly DataCap usage rate

500TiB

DataCap Amount - First Tranche

100TiB

Client address

f1paij7fdmzcgngc7v3mdoxuh6jvm4u5cl4uy7soi


datacap-bot bot commented Aug 27, 2024

DataCap Allocation requested

Multisig Notary address

Client address

f1paij7fdmzcgngc7v3mdoxuh6jvm4u5cl4uy7soi

DataCap allocation requested

100TiB

Id

7414ee5e-d576-441c-8f5d-b4dc9dadd8e3


datacap-bot bot commented Aug 27, 2024

Application is ready to sign


datacap-bot bot commented Aug 27, 2024

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacec4ctbkvr24lc5znc2k2j7emjcyfng54lezhiazrzqkrh56pzxnyq

Address

f1paij7fdmzcgngc7v3mdoxuh6jvm4u5cl4uy7soi

Datacap Allocated

100TiB

Signer Address

f14argljqmkt3ig4yhi5hqbm44eezafterkjj56uq

Id

7414ee5e-d576-441c-8f5d-b4dc9dadd8e3

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacec4ctbkvr24lc5znc2k2j7emjcyfng54lezhiazrzqkrh56pzxnyq


datacap-bot bot commented Aug 27, 2024

Application is Granted

@cryptowhizzard
Owner

Greetings

We have been experimenting with Curio on f02982293 for the last two weeks. It seems there has been, or still is, a bug somewhere causing duplicate CIDs.

The amount involved is small (only a few TB of DataCap). We are investigating and resolving the issue together with the Curio team.
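
For illustration, a minimal sketch of how duplicate piece CIDs could be flagged from an exported deal list; the CSV file name and column names are assumptions made for the sketch, not Curio's actual schema:

```python
# Hypothetical sketch: find piece CIDs that appear in more than one deal,
# given a CSV export with columns "deal_uuid" and "piece_cid". The file name
# and column names are assumptions, not Curio's actual schema.
import csv
from collections import defaultdict

deals_by_piece = defaultdict(list)

with open("deals_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        deals_by_piece[row["piece_cid"]].append(row["deal_uuid"])

for piece_cid, deal_ids in deals_by_piece.items():
    if len(deal_ids) > 1:
        print(f"duplicate piece {piece_cid}: {len(deal_ids)} deals -> {deal_ids}")
```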

@cryptowhizzard
Owner

checker:manualTrigger


datacap-bot bot commented Sep 2, 2024

DataCap and CID Checker Report Summary [1]

Storage Provider Distribution

⚠️ 1 storage provider sealed more than 90% of total datacap - f02982293: 100.00%

⚠️ All storage providers are located in the same region.

⚠️ 100.00% of Storage Providers have retrieval success rate equal to zero.

⚠️ 100.00% of Storage Providers have retrieval success rate less than 75%.

⚠️ The average retrieval success rate is 0.00%

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients [2]

✔️ No CID sharing has been observed.

Full report

Click here to view the CID Checker report.

Footnotes

  1. To manually trigger this report, add a comment with text checker:manualTrigger

  2. To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...


datacap-bot bot commented Sep 2, 2024

Client used 75% of the allocated DataCap. Consider allocating next tranche.

@cryptowhizzard
Owner

Greetings

Unfortunately this application is beyond recovery.

  • We had connectivity problems with AWS (slow transfers).
  • Many files in this set changed on AWS before we had finished compiling the CAR files, leading to massive commP errors and incomplete data (a possible mitigation is sketched after this comment).
  • Curio <> Yugabyte still has problems, leading to SQL errors and duplicate deals.
  • The Curio pipeline with snapping was not stable. (This is fixed now, but the damage has been done.)

Therefore, it is best to close this one and revisit this dataset in the near future to make a second attempt to onboard it.
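
One possible mitigation for the second point above (source objects changing while CAR files are being compiled) is to snapshot the S3 ETags before packing and re-check them afterwards, re-preparing anything that changed in between. A hedged sketch using boto3; the bucket name and prefix are placeholders, not the actual dataset bucket:

```python
# Hypothetical mitigation sketch: record S3 ETags before CAR packing and
# compare them afterwards, flagging objects that changed mid-preparation.
# Bucket name and prefix are placeholders; boto3 credentials/config assumed.
import boto3

BUCKET = "example-open-dataset"   # placeholder, not the actual source bucket
PREFIX = ""

s3 = boto3.client("s3")

def snapshot_etags(bucket, prefix):
    """Return {key: etag} for every object under the prefix."""
    etags = {}
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            etags[obj["Key"]] = obj["ETag"]
    return etags

before = snapshot_etags(BUCKET, PREFIX)
# ... run CAR packing / commP computation here ...
after = snapshot_etags(BUCKET, PREFIX)

changed = [k for k, etag in before.items() if after.get(k) != etag]
print(f"{len(changed)} objects changed during preparation; re-pack these:")
for key in changed:
    print(" ", key)
```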
