Nick Edwards

1.8K posts

@Nick___Edwards

Autonomous science. Founder and CEO at Potato (@readysetpotato). Former neuro at Brown, NIH, UCSD.

San Diego, CA · Joined October 2012
1K Following · 1.4K Followers
Pinned Tweet
Nick Edwards @Nick___Edwards
We're building agents for autonomous science. Closed-loop, faster iterations, more discovery, less time. Massive human scientist + AI scientist collaboration.
[image]
6 replies · 13 reposts · 117 likes · 61K views
Nick Edwards retweeted
Prachee Avasthi @PracheeAC
This articulation is super helpful framing because it gets at something I also see on the front lines of drug discovery research: when we say AI dramatically accelerates the work, a key aspect is decreasing the friction of cross-team collaboration.

A clear example of this happening for us is bridging the gap between frontier computational scientists/theorists and experimentalists, so that we don't have to productionize computational workflows for broader internal use while they are still under active tinkering/development. The iteration to experimentally test computational predictions becomes tighter while we simultaneously improve and innovate on the computational approaches. Productionization by software engineers can happen downstream, once we have greater confidence and experimental validation of the approach, and now also with reduced tech debt or improved ability to deal with it.

Another place I can imagine AI acceleration helping is giving computational folks greater intuition about the timescales, difficulty, and reliability of various experimental approaches that inform their bottlenecks.

Another way to say all this: there's an inherent coordination headwind when you have deep technical experts in disparate domains trying to solve hard problems together. When everyone gains just a little more breadth with the help of AI tools, what felt like cavernous gaps between siloed teams requiring heavy-handed operational coordination now feels like more fluid collaboration.
Ash Jogalekar @curiouswavefn
My take on the whole "AI cures cancer in dog in Australia" story: it's very interesting, but perhaps not for the reasons being noted.

In 2007, Freeman Dyson published an essay in The New York Review of Books called "Our Biotech Future." It contains one of the most memorable predictions about the future of biology I've ever read: "I predict that the domestication of biotechnology will dominate our lives during the next fifty years at least as much as the domestication of computers has dominated our lives during the previous fifty years." Dyson believed biology would eventually follow the trajectory of computing. At first, powerful tools live inside large institutions: universities, government labs, major companies. Over time those tools get cheaper, easier to use, and more widely distributed. Eventually individuals start doing things that once required entire organizations. "Biotechnology will become small and domesticated rather than big and centralized." He even imagined genome design becoming something almost artistic: "Designing genomes will be a personal thing, a new art form as creative as painting or sculpture."

Dyson's words rang in my mind as I read the "AI cures dog cancer" story. Much of the coverage framed it as an example of AI discovering new science, but that's not really the interesting part of the story. The scientific pipeline involved is actually well known: it closely mirrors the workflow used in personalized neoantigen vaccine research, which has been under active development for years. The steps are fairly standard: sequence the tumor, identify somatic mutations, predict which mutated peptides might be recognized by the immune system, encode those sequences in an mRNA construct, and deliver them to stimulate an immune response. The biological targets themselves were almost certainly not new discoveries (I have been unable to find out what they are, but mutations in common targets like KIT might be involved).
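The peptide-enumeration step in that standard workflow can be sketched in a few lines. This is a toy illustration only: the function, its windowing scheme, and the example inputs are hypothetical placeholders, not any tool actually used in the story.

```python
def mutated_peptides(protein, mutations, flank=4):
    """Toy sketch of neoantigen candidate enumeration: for each somatic
    missense mutation (0-based position, new residue), apply it to the
    protein sequence and slice a short peptide window around it. These
    mutation-spanning windows are what downstream immunogenicity
    predictors would then rank."""
    peptides = []
    for pos, alt in mutations:
        mutant = protein[:pos] + alt + protein[pos + 1:]  # apply the mutation
        start = max(0, pos - flank)
        peptides.append(mutant[start:pos + flank + 1])    # mutation-centered window
    return peptides
```

For example, a single mutation at position 5 of a 14-residue toy sequence yields one 9-mer centered on the altered residue; real pipelines would enumerate many overlapping window lengths per mutation.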
Partly therein lies the rub: the hardest part of drug discovery, whether in humans or dogs, is target validation, the lack of which leads to lack of efficacy, the #1 reason for drug failure. In neoantigen vaccines, the proteins involved are usually ordinary cellular proteins that happen to contain tumor-specific mutations. AlphaFold, which was used to map the mutations onto specific protein structures, is now a standard part of drug discovery pipelines. The challenge is identifying which mutated peptides might plausibly trigger immunity.

What is interesting, though, is how the pipeline was assembled. Normally this type of workflow spans multiple domains (genomics, bioinformatics, immunology, and translational medicine), and in institutional settings those pieces are distributed across specialized teams, document sources, and legal and technical barriers. Navigating the literature, selecting computational tools, interpreting sequencing results, and designing a candidate mRNA construct is typically a collaborative process. In this case, AI appears to have helped compress that process, pulling together data and tools from different sources. Instead of requiring multiple experts, a motivated individual was able to assemble the workflow with AI acting as a kind of guide through the technical landscape.

I've seen something similar in my own work building lead-optimization pipelines in drug discovery. The underlying science hasn't changed, but the friction involved in assembling the workflow can drop dramatically. Tasks that once required stitching together multiple tools, papers, and areas of expertise can now often be executed much faster with AI helping navigate the terrain; by faster I mean roughly 100x. That kind of workflow compression is powerful, to say the least. When the cost of navigating technical knowledge drops, more people can realistically assemble sophisticated research pipelines.
This story is a great example of what naively seems like a boring quantitative acceleration of the research process. The real novelty here is not the biology but the combination of three things: a non-specialist orchestrating a complex biomedical pipeline, AI acting as a navigational layer across multiple technical domains, and the resulting decentralization of capabilities that were once confined to institutional research environments.

But I think the story also points to something deeper: a challenge to modern regulatory environments. Modern biomedical innovation does not operate solely according to what is scientifically possible. It is structured by regulatory frameworks: clinical trials, safety oversight, institutional review boards, and regulatory agencies. Those systems exist for important reasons, but they also assume that the development of therapies occurs primarily within large, regulated organizations. When individuals begin assembling pieces of these pipelines outside those institutions, the relationship between technological capability and regulatory oversight starts to shift.

The dog in this story sits outside the human regulatory framework; that fact alone made the experiment possible. In other words, the story is not just about technological capability; it is also about how certain forms of experimentation can occur when they bypass the regulatory pathways that normally govern biomedical innovation. One is reminded of another Australian, Barry Marshall, who received a Nobel for demonstrating through self-experimentation that ulcers are caused by bacteria. This raises an interesting question: what happens when the tools for assembling sophisticated biological workflows become widely accessible while the regulatory structures governing them remain institution-centric? That tension may ultimately be the most important implication of this moment.
Regulatory frameworks will need to adapt to this kind of citizen science. Seen in this light, the story about the AI-assisted vaccine is less about a breakthrough in cancer therapy and more about a glimpse of the early stages of something Dyson anticipated nearly two decades ago: the domestication of biotechnology.

If AI continues to reduce the cognitive overhead required to navigate biological knowledge and assemble complex pipelines, the boundary between professional research and motivated individuals may begin to blur. That shift will require careful thinking about safety, governance, and responsibility. But it also carries an exciting possibility. Dyson imagined a world in which biological design might eventually become a creative craft practiced not only by institutions but also by curious individuals experimenting at smaller scales. For a long time that vision felt distant. Now it feels like we may be seeing the first hints of it.

0 replies · 2 reposts · 10 likes · 2.8K views
Nick Edwards @Nick___Edwards
Fully agree with this take, although I really hope that personal pets don't become the next model organism.
Ash Jogalekar @curiouswavefn
(quoted tweet: the same thread shown in full above)
0 replies · 0 reposts · 2 likes · 284 views
Nick Edwards retweeted
Jason Kelly @jrkelly
Today, we're launching a new Cloud Lab protocol: Cell Free Protein Expression with HiBiT Quantification. Test expression of your protein designs in our cell free lysate and quantify the expressed proteins directly from the crude reaction mixture using @promega's Nano-Glo® HiBiT Lytic Detection System.

As with all of our cloud lab protocols, we offer transparent pricing and visibility into each step and instrument used to run your sequences, and deliver the following data outputs:

- Expression Detection: confirmation of target protein expression measured via background-subtracted luminescence (bcRLU) relative to controls.
- Relative Yield Quantification: precise measurement of target protein concentration (reported in nM), interpolated from a standard curve.
- Assay Quality Metrics: comprehensive statistical breakdown of the screening run, delivered with Z-prime scores, replicate CV%, and matrix-effect controls to ensure data reliability.

Place your order today: cloud.ginkgo.bio/protocols/cell…
1 reply · 7 reposts · 21 likes · 1.8K views
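The assay quality metrics named in that protocol announcement, Z-prime and replicate CV%, have standard textbook definitions; here is a minimal sketch of how they are computed from control-well replicates. The function names and example readings are illustrative assumptions, not Ginkgo's actual analysis code.

```python
import statistics

def z_prime(pos_controls, neg_controls):
    """Z' factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above roughly 0.5 indicate a robust screening-assay window
    between positive and negative control signals."""
    sd_p = statistics.stdev(pos_controls)
    sd_n = statistics.stdev(neg_controls)
    mu_p = statistics.mean(pos_controls)
    mu_n = statistics.mean(neg_controls)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

def cv_percent(replicates):
    """Coefficient of variation (%) across replicate wells:
    sample standard deviation divided by the mean, times 100."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)
```

With made-up luminescence readings such as positive controls `[98, 100, 102]` and negative controls `[1, 2, 3]`, `z_prime` returns about 0.91 and `cv_percent` of the positives is 2.0, the kind of numbers a quality report would flag as a healthy run.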
Nick Edwards @Nick___Edwards
@ThatMrE Artificial artificial intelligence. This could hypothetically work; lobster neuroanatomy is fairly well known.
0 replies · 0 reposts · 2 likes · 82 views
Elliot Roth || SF @ThatMrE
Who wants me to help openclaw take over a real lobster with electrodes? I promise you a good time and lobster rolls at the end.
8 replies · 3 reposts · 20 likes · 1.5K views
Nick Edwards @Nick___Edwards
@SynBio1 Please for the love of all that is good do this
0 replies · 0 reposts · 3 likes · 163 views
Jake Wintermute 🧬/acc
Did you ever wonder what Shrek 2 would look like if you rendered it in the dimensions of a 1536-well microplate and color-mapped it to fluorescent reporter proteins, as if it were animated as agar pixel art? This is how I'm choosing to spend the extra productivity I get from AI
[GIF]
6 replies · 6 reposts · 52 likes · 4.7K views
Adam Draper ⏻ @AdamDraper
@jrkelly I draw them, take a picture of my sketch. And then I use AI to ink them.
1 reply · 0 reposts · 5 likes · 194 views
Nick Edwards retweeted
Jason Kelly @jrkelly
Great to see this bipartisan leadership from Congressmen @JakeAuch, @RoKhanna, @JayObernolte, and @RepMcCormick! We need to move off the manual lab bench and onto robotics in the US if we're going to remain competitive with China. Autonomous labs available as "Cloud Labs" will accelerate US scientists. @NSF with its new Director @regardthefrost is the perfect agency to bring about this change in how we do science! 👏👏
[image]
5 replies · 15 reposts · 66 likes · 8.3K views
Nick Edwards @Nick___Edwards
Autonomous labs + AI scientists = autonomous science. Ginkgo's on fire. Stoked to collaborate with them.
Jason Kelly @jrkelly
Excited to launch the @Ginkgo Cloud Lab service today! Recently, GPT-5 ordered experiments from Ginkgo's autonomous lab in our work with @OpenAI below; now we're making our lab available to users (or their AI models) in the cloud to order lab experiments and get back data online.

Play around with it now! You can ask our agent about your protocol and it will do its best to evaluate whether we can run it and what it would cost. cloud.ginkgo.bio/protocols

To start, we've launched 3 Ginkgo Certified Protocols: two around cell free protein expression and one to make bacterial pixel art 😀 We will be adding new protocols weekly; at first ones we certify, but eventually users will order whatever experiment they want as long as we have the needed equipment on our autonomous lab!

We hope that Cloud Labs will someday allow anyone to be a scientist with their own lab, just like personal computers and cloud data centers democratized programming and the web. More in thread 🧵 and happy to answer Qs if you post!

4 replies · 3 reposts · 77 likes · 9.8K views
Nick Edwards retweeted
Jason Kelly @jrkelly
Really fun to be on @Nick___Edwards' podcast! Clip here, but one of the essential things for autonomous labs to work is that scientists can use human language to order experiments. Love that companies like @readysetpotato are working on this problem! Full podcast here: buzzsprout.com/975439/episode…
9 replies · 5 reposts · 26 likes · 3.1K views