<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Writing on Reed Bender</title><link>https://reedbender.com/writing/</link><description>Recent content in Writing on Reed Bender</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Wed, 25 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://reedbender.com/writing/index.xml" rel="self" type="application/rss+xml"/><item><title>Data Is the Moat</title><link>https://reedbender.com/writing/data-is-the-moat/</link><pubDate>Wed, 25 Mar 2026 00:00:00 +0000</pubDate><guid>https://reedbender.com/writing/data-is-the-moat/</guid><description>&lt;p&gt;The AI discourse right now is fixated on two things: model improvements and agent frameworks. Which model is best. Which orchestration layer is cleanest. How many tools you can wire into a loop. How many agents you can run in parallel. Every week my X feed is filled with a new harness, a new benchmark, a new claim about reasoning.&lt;/p&gt;
&lt;p&gt;None of it matters as much as the data.&lt;/p&gt;
&lt;p&gt;The models are converging. A &lt;a href="https://www.semianalysis.com/p/google-we-have-no-moat-and-neither"&gt;leaked Google memo&lt;/a&gt; called it in 2023: &amp;ldquo;We Have No Moat, And Neither Does OpenAI.&amp;rdquo; Since then, &lt;a href="https://huggingface.co/Qwen"&gt;Qwen&lt;/a&gt;, &lt;a href="https://www.deepseek.com/"&gt;DeepSeek&lt;/a&gt;, and Llama have all closed the gap. James Betker at OpenAI &lt;a href="https://nonint.com/2023/06/10/the-it-in-ai-models-is-the-dataset/"&gt;said it best&lt;/a&gt;:&lt;/p&gt;</description></item><item><title>Please Explain this Gap in your Resume</title><link>https://reedbender.com/writing/please-explain-this-gap/</link><pubDate>Sat, 21 Mar 2026 00:00:00 +0000</pubDate><guid>https://reedbender.com/writing/please-explain-this-gap/</guid><description>&lt;p&gt;There&amp;rsquo;s a stretch on my resume between leaving a computational biology role at Flagship Pioneering in Cambridge and showing up as a Senior Data Engineer at Lantern Pharma in Dallas that doesn&amp;rsquo;t look anything like career progression.&lt;/p&gt;
&lt;p&gt;I was on a farm in North Carolina.&lt;/p&gt;
&lt;p&gt;I&amp;rsquo;d spent two years at &lt;a href="https://www.flagshippioneering.com/companies/etiome"&gt;Etiome&lt;/a&gt; as the first software engineer, building ML infrastructure for single-cell RNA-sequencing and electronic medical records. We were modeling the biochemical factors of disease progression temporally, at single-cell resolution, trying to capture how disease actually evolves across patients where everybody is n=1. Great team, good data, interesting problems. But the higher the resolution got, the more it felt like I was getting a sharper and sharper image of the wrong thing. We could see what happened as disease progressed. We couldn&amp;rsquo;t see what caused it to progress. If the biochemical pathways were transistors, we were watching them flip without ever finding the programming language behind their orchestration. Those are two very different questions, and no amount of sequencer resolution was going to collapse them into one.&lt;/p&gt;</description></item><item><title>Our Superconducting Consciousness</title><link>https://reedbender.com/writing/our-superconducting-consciousness/</link><pubDate>Sun, 06 Feb 2022 00:00:00 +0000</pubDate><guid>https://reedbender.com/writing/our-superconducting-consciousness/</guid><description>&lt;figure&gt;
&lt;img src="https://reedbender.com/images/superconducting/hero-expand-amy-pennell.jpg" alt="Expand, by Amy Pennell."&gt;
&lt;figcaption&gt;Expand, by Amy Pennell.&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;blockquote&gt;
&lt;p&gt;&amp;ldquo;Behind it all is surely an idea so simple, so beautiful, that when we grasp it &amp;ndash; in a decade, a century, or a millennium &amp;ndash; we will all say to each other, how could it have been otherwise?&amp;rdquo;&lt;/p&gt;
&lt;p&gt;&amp;ndash; John Archibald Wheeler, 1986&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p class="post-note"&gt;&lt;strong&gt;Note:&lt;/strong&gt; If you would prefer to read this as a PDF manuscript, it is available &lt;a href="https://superconductingconsciousnesspublic.s3.amazonaws.com/Bender_Superconducting_Consciousness.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p class="post-note"&gt;&lt;strong&gt;2026 Author's Note:&lt;/strong&gt; This essay was originally published in February 2022. In September 2022, &lt;em&gt;Nature&lt;/em&gt; retracted the Snider &lt;em&gt;et al.&lt;/em&gt; (2020) paper on room-temperature superconductivity in carbonaceous sulfur hydride, which is cited in Section 4. The retraction states: &lt;em&gt;"We have now established that some key data processing steps used a non-standard, user-defined procedure... these processing issues undermine confidence in the published magnetic susceptibility data as a whole."&lt;/em&gt; (&lt;a href="https://doi.org/10.1038/s41586-022-05294-9"&gt;Nature 610, 804, 2022&lt;/a&gt;). The broader argument of this essay, that biological superconductivity may be possible drawing on Cope, Becker, and Mikheenko, does not depend on the Snider result. The relevant passage has been annotated below. I have left the original text intact for transparency rather than silently removing it.&lt;/p&gt;</description></item></channel></rss>