Discussion about this post

Victor Bustos:

I imagine a global research infrastructure where all raw data generated by publicly funded experiments is automatically streamed into a shared, standards-based database.

Every instrument, every lab, same day.

On top of this, an AI layer continuously cleans, annotates, and analyzes the data, linking it to protocols, samples, and prior results.

Researchers around the world can then query this living dataset in real time (“find experiments similar to mine where the antibody failed”), dramatically accelerating discovery and making publicly funded research truly public.
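A minimal sketch of what that query layer could look like, assuming a toy in-memory stand-in for the shared store. The `Experiment` schema and the `find_similar()` helper are hypothetical illustrations, not any existing API:

```python
from dataclasses import dataclass, field

@dataclass
class Experiment:
    lab: str
    protocol: str
    reagents: list[str] = field(default_factory=list)
    outcome: str = ""

# Toy stand-in for the shared, standards-based store.
SHARED_DB = [
    Experiment("Lab A", "western blot", ["anti-GFP antibody"], "antibody failed"),
    Experiment("Lab B", "western blot", ["anti-HA antibody"], "clean bands"),
    Experiment("Lab C", "immunostaining", ["anti-GFP antibody"], "antibody failed"),
]

def find_similar(db, protocol=None, outcome_contains=None):
    """Return experiments matching a protocol and/or an outcome keyword."""
    hits = db
    if protocol is not None:
        hits = [e for e in hits if e.protocol == protocol]
    if outcome_contains is not None:
        hits = [e for e in hits if outcome_contains in e.outcome]
    return hits

# "Find experiments similar to mine where the antibody failed."
for exp in find_similar(SHARED_DB, protocol="western blot",
                        outcome_contains="antibody failed"):
    print(exp.lab, exp.protocol, exp.outcome)
```

In a real deployment the store would be a federated database with controlled vocabularies, and "similar" would presumably mean embedding-based search rather than keyword matching, but the shape of the query is the same.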

Gavin:

> Perhaps a blockchain-like structure could help - tracking all the value generated, from initial invention, to later deployment and discovery, to final design, routing resources to those who could make contributions, and helping assign rewards to them later - perhaps when their work might have been otherwise forgotten, because the latter steps captured perceived value and obscured key initial steps. ... access to the blockchain of knowledge would require some kind of enforceable agreement about sharing of future credit and profits. But perhaps someone could figure that out.

Etica is doing something like that with their Retroactive Rewards protocol:

https://www.eticaprotocol.org/opr3-layer-concept

https://eticaprotocol.org/revolutionizing-drug-development

Overall, I feel that tracking the path of intellectual connections isn't so hard; existing PIDs (e.g., DOIs) do a good job of that, and a few bibliometric studies have investigated, for instance, the citation network that leads to the patenting of a new drug. The hard part seems to be allocating value to the early steps and, as you say, ensuring that credit and profit are shared.
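To make the allocation problem concrete, here is a rough sketch of one possible rule: given a citation graph (which DOIs can reconstruct), propagate a share of a patent's value backward to the work it builds on. The tiny graph, the decay factor, and the equal-split rule are all illustrative assumptions, not anyone's actual protocol:

```python
# Edges point from a work to the works it cites (a made-up DOI graph).
CITES = {
    "patent:drug-X":             ["doi:10.1/clinical", "doi:10.1/lead-compound"],
    "doi:10.1/clinical":         ["doi:10.1/lead-compound"],
    "doi:10.1/lead-compound":    ["doi:10.1/target-discovery"],
    "doi:10.1/target-discovery": [],
}

def allocate_credit(graph, source, value=1.0, decay=0.5):
    """Split `value` between a work and its references, recursively.

    Each work keeps (1 - decay) of whatever reaches it and passes the
    rest equally to the works it cites, so early steps still get a share.
    """
    credit = {}
    stack = [(source, value)]
    while stack:
        node, v = stack.pop()
        refs = graph.get(node, [])
        keep = v if not refs else v * (1 - decay)
        credit[node] = credit.get(node, 0.0) + keep
        for ref in refs:
            stack.append((ref, v * decay / len(refs)))
    return credit

for work, share in sorted(allocate_credit(CITES, "patent:drug-X").items(),
                          key=lambda kv: -kv[1]):
    print(f"{work}: {share:.3f}")
```

The shares sum to the original value, and the decay parameter controls how much flows back to the earliest steps; picking it fairly, and getting everyone to agree to it up front, is exactly the enforceable-agreement problem the quoted passage raises.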
