Code is a liability (not an asset). Tech bosses don't understand this. They think AI is great because it produces 10,000 times more code, but that means it's producing 10,000 times more liability. AI is the asbestos we're shoveling into the walls of our high-tech society:
pluralistic.net/2025/09/27/eco…
--
If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
pluralistic.net/2026/01/06/100…
1/
Code is a liability. Code's *capabilities* are assets. The goal of a tech shop is to have code whose capabilities generate more revenue than the costs associated with keeping that code running. For a long time, firms have nurtured a false belief that code costs less to run over time: after an initial shakedown period in which the bugs in the code are found and addressed, code ceases to need meaningful maintenance.
2/
After all, code is a machine without moving parts - it does not wear out; it doesn't even wear down.
This is the thesis of Paul Mason's 2015 book *Postcapitalism*, a book that has aged remarkably poorly (though not, perhaps, as poorly as Mason's own political credibility): in reality, code is not an infinitely reproducible machine that requires no labor inputs to operate.
3/
Rather, it is a brittle machine that requires increasingly heroic measures to keep it in good working order, and which eventually does "wear out" (in the sense of needing a top-to-bottom refactoring).
To understand why code is a liability, you have to understand the difference between "writing code" and "software engineering."
4/
"Writing code" is an incredibly useful, fun, and engrossing pastime. It involves breaking down complex tasks into discrete steps that are so precisely described that a computer can reliably perform them, and optimising that performance by finding clever ways of minimizing the demands the code puts on the computer's resources, such as RAM and processor cycles.
5/
Meanwhile, "software engineering" is a discipline that subsumes "writing code," but with a focus on the long-term operations of the *system* the code is part of. Software engineering concerns itself with the upstream processes that generate the data the system receives. It concerns itself with the downstream processes that the system emits processed information to.
6/
It concerns itself with the adjacent systems that are receiving data from the same upstream processes and/or emitting data to the same downstream processes the system is emitting to.
"Writing code" is about making code that *runs well*. "Software engineering" is about making code that *fails well*.
7/
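To make the distinction concrete, here's a minimal sketch (Python, with hypothetical field names and thresholds of my own invention):

```python
# Hypothetical example: parsing a temperature reading from an upstream feed.

def parse_reading_runs_well(raw: str) -> float:
    # "Runs well": fine on clean input, silently wrong or cryptic on anything
    # else - a Fahrenheit value slips through as if it were Celsius, and a
    # blank field blows up with no hint of where the bad data came from.
    return float(raw)

def parse_reading_fails_well(raw: str, source: str) -> float:
    # "Fails well": checks the assumptions the upstream process is supposed to
    # honor, and when they're violated, fails loudly, with enough context for
    # whoever has to maintain this later.
    if raw is None or raw.strip() == "":
        raise ValueError(f"empty reading from {source}")
    try:
        value = float(raw)
    except ValueError:
        raise ValueError(f"non-numeric reading {raw!r} from {source}")
    if not -90.0 <= value <= 60.0:  # plausible range for Celsius surface temps
        raise ValueError(f"implausible reading {value} from {source} - wrong units?")
    return value
```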
It's about making code that is legible - that can be understood by third parties asked to maintain it, or who might be asked to adapt the processes downstream, upstream or adjacent to the system to keep it from breaking. It's about making code that can be adapted, for example, when the underlying computer architecture it runs on is retired and has to be replaced, either with a new kind of computer, or with an emulated version of the old computer:
theregister.com/2026/01/05/hpu…
8/
Because that's the thing: any nontrivial code has to interact with the outside world, and the outside world isn't static, it's *dynamic*. The outside world busts through the assumptions made by software authors *all the time* and every time it does, the software needs to be fixed. Remember Y2K? That was a day when perfectly functional code, running on perfectly functional hardware, would stop functioning - not because the code changed, but because *time marched on*.
9/
We're 12 years away from the Y2038 problem, when 32-bit flavors of Unix will all cease to work, because they, too, will have run out of computable seconds. These computers haven't changed, their software hasn't changed, but the world - by dint of ticking over, a second at a time, for 68 years - will wear through their seams, and they will rupture:
theregister.com/2025/08/23/the…
10/
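You can check the arithmetic yourself. Here's a back-of-the-envelope sketch (a simplified illustration of a signed 32-bit time counter, not how any particular Unix implements it):

```python
# Classic Unix stores time as a signed 32-bit count of seconds since the
# "epoch": midnight UTC, 1 Jan 1970.
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_INT32 = 2**31 - 1  # the largest value a signed 32-bit integer can hold

print(EPOCH + timedelta(seconds=MAX_INT32))   # 2038-01-19 03:14:07+00:00
print(MAX_INT32 / (365.25 * 24 * 3600))       # ~68 years of computable seconds

# One second later the counter wraps around to -2**31, which decodes as 1901.
print(EPOCH + timedelta(seconds=-2**31))      # 1901-12-13 20:45:52+00:00
```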
The existence of "the world" is an inescapable factor that wears out software and requires it to be rebuilt, often at enormous expense. The longer code is in operation, the more likely it is that it will encounter "the world." Take the code that devices use to report on their physical location. Originally, this was used for things like billing - determining which carrier or provider's network you were using and whether you were roaming.
11/
Then, our mobile devices used it to determine our location in order to give us turn-by-turn directions. Then, this code was repurposed again to help us find our lost devices. This, in turn, became a way to locate *stolen* devices, a use-case that sharply diverges from finding lost devices in important ways - for example, when locating a lost device, you don't have to contend with the possibility that a malicious actor has disabled the "find my lost device" facility.
12/
These additional use cases - upstream, downstream and adjacent - exposed bugs in the code that never surfaced in the earlier apps. For example, all location services have some kind of default behavior in the (very common) event that they're not really sure where they are. Maybe they have a general fix - for example, they know which cellular mast they're connected to or they know where they were the *last* time they got an accurate location fix - or maybe they're totally lost.
13/
It turns out that in many cases, location apps drew a circle around all the places they *could* be and then set their location to the middle of that circle. That's fine if the circle is only a few feet in diameter, or if the app quickly replaces this approximation with a more precise location. But what if the circle is miles and miles across, and the location fix *never* improves?
14/
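Here's a toy sketch of how that kind of centroid fallback goes wrong (the coordinates are hypothetical, not any vendor's actual data or code):

```python
# Naive fallback: average every place the device *could* be and report the middle.

def naive_location_fix(candidates):
    lats = [lat for lat, lon in candidates]
    lons = [lon for lat, lon in candidates]
    return (sum(lats) / len(lats), sum(lons) / len(lons))

# Candidates a few hundred meters apart: the centroid is a useful guess.
nearby = [(34.1808, -118.3090), (34.1820, -118.3075), (34.1815, -118.3102)]
print(naive_location_fix(nearby))

# Candidates scattered across a whole country: the centroid is a precise-looking
# point that is almost certainly nowhere near the device - and if the fix never
# improves, some unlucky household near the middle of the map gets the blame.
continental = [(47.6, -122.3), (25.8, -80.2), (40.7, -74.0), (34.1, -118.2)]
print(naive_location_fix(continental))
```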
What if the location for any IP address without a defined location is given as *the center of the continental USA* and any app that doesn't know where it is reports that it is in a house in Kansas, sending dozens of furious (occasionally armed) strangers to that house, insisting that the owners are in possession of their stolen phones and tablets?
theweek.com/articles/624040/ho…
You don't just have to fix this bug once - you have to fix it over and over again.
15/
In Georgia:
jezebel.com/why-lost-phones-ke…
In Texas:
abc7chicago.com/post/find-my-i…
And in my town of Burbank, where Google's location-sharing service once told us that our then-11-year-old daughter (whose phone we couldn't reach) was 12 miles away, on a freeway ramp in an unincorporated area of LA county (she was at a nearby park, but out of range, and the app estimated her location as the center of the region it had last fixed her in) (it was a rough couple of hours).
16/
The underlying code - the code that uses some once-harmless default to fudge unknown locations - needs to be updated *constantly*, because the upstream, downstream and adjacent processes connected to it are changing *constantly*. The longer that code sits there, the more superannuated its original behaviors become, and the more baroque, crufty and obfuscated the patches layered atop it become.
17/
Code is not an asset - it's a liability. The longer a computer system has been running, the more tech debt it represents. The more important the system is, the harder it is to bring down and completely redo. Instead, new layers of code are slathered atop it, and wherever those layers meet, there are fissures where the systems' behaviors don't exactly match up.
18/
Worse still: when two companies are merged, their seamed, fissured IT systems are smashed together, so that now there are *adjacent* sources of tech debt, as well as upstream and downstream cracks:
pluralistic.net/2024/06/28/dea…
19/
That's why giant companies are so susceptible to ransomware attacks - they're full of incompatible systems that have been coaxed into a facsimile of compatibility with various forms of digital silly putty, string and baling wire. They are not watertight and they cannot be made watertight.
20/
Even if they're not taken down by hackers, they sometimes just fall over and can't be stood back up again - like when Southwest Airlines' computers crashed for all of Christmas week 2022, stranding millions of travelers:
pluralistic.net/2023/01/16/for…
Airlines are especially bad, because they computerized early, and can't ever shut down the old computers to replace them with new ones.
21/
This is why their apps are such dogshit - and why it's so awful that they've fired their customer service personnel and require fliers to use the apps for *everything*, even though the apps do. not. work. These apps won't ever work.
The reason that British Airways' app displays "An unknown error has occurred" 40-80% of the time isn't (just) that they fired all their IT staff and outsourced to low bidders overseas.
22/
It's that, sure - but also that BA's first computers ran on electromechanical valves, and everything since has to be backwards-compatible with a system that one of Alan Turing's proteges gnawed out of a whole log with his very own front teeth. Code is a liability, not an asset (BA's new app is years behind schedule).
23/
Code is a liability. The servers for the Bloomberg terminals that turned Michael Bloomberg into a billionaire run on RISC chips, meaning that the company is locked into using a dwindling number of specialist hardware and data-center vendors, paying specialized programmers, and building brittle chains of code to connect these RISC systems to their less exotic equivalents in the world. Code isn't an asset.
24/
AI can write code, but AI can't do software engineering. Software engineering is all about thinking through *context* - what will come before this system? What will come after it? What will sit alongside it? How will the world change? Software engineering requires a very wide "context window" - something that AI does not, and cannot, have.
25/
AI has a very narrow and shallow context window, and a linear expansion of AI's context window requires a *geometric* expansion in the amount of computational resources the AI consumes:
pluralistic.net/2025/10/29/wor…
Writing code that works, without consideration of how it will fail, is a recipe for catastrophe. It is a way to create tech debt at scale. It is shoveling asbestos into the walls of our technological society.
26/
Bosses *do not know* that code is a liability, not an asset. That's why they won't shut the fuck up about the chatbots that shit out 10,000 times more code than any human programmer. They think they've found a machine that produces *assets* at 10,000 times the rate of a human programmer. They haven't. They've found a machine that produces *liability* at 10,000 times the rate of any human programmer.
27/
Maintainability isn't just a matter of hard-won experience teaching you where the pitfalls are. It also requires the cultivation of "Fingerspitzengefühl" - the "fingertip feeling" that lets you make reasonable guesses about where never before seen pitfalls might emerge. It's a form of process knowledge. It is ineluctable. It is not latent in even the largest corpus of code that you could use as training data:
pluralistic.net/2025/09/08/pro…
28/
*Boy* do tech bosses not get this. Take Microsoft. Their big bet right now is on "agentic AI." They think that if they install spyware on your computer that captures every keystroke, every communication, and every screen you see, sends it all to Microsoft's cloud, and gives a menagerie of chatbots access to it, you'll be able to tell your computer, "Book me a train to Cardiff and find that hotel Cory mentioned last year and book me a room there" and it will do it.
29/
This is an incredibly unworkable idea. No chatbot is remotely capable of doing all these things, something that Microsoft freely stipulates. Rather than doing this with one chatbot, Microsoft proposes to break this down among dozens of chatbots, each of which Microsoft hopes to bring up to 95% reliability.
That's an utterly implausible chatbot standard in and of itself, but consider this: probabilities are *multiplicative*.
30/
A system containing two processes that each operate at 95% reliability has a net reliability of 90.25% (0.95 * 0.95). Break a task down among a couple dozen 95%-reliable bots and the chance that the whole task will be accomplished correctly falls below 30% - and if each bot has to take more than one action to do its part, the odds collapse toward *zero*.
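The arithmetic is easy to check (a sketch, assuming independent failures and a single pass per agent - both generous simplifications):

```python
# End-to-end reliability of a chain where every step must succeed.

def chain_reliability(per_step: float, steps: int) -> float:
    return per_step ** steps

print(chain_reliability(0.95, 2))    # 0.9025  -> 90.25%
print(chain_reliability(0.95, 12))   # ~0.54   -> a coin flip
print(chain_reliability(0.95, 24))   # ~0.29   -> fails most of the time
print(chain_reliability(0.95, 100))  # ~0.006  -> effectively never works
```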
Worse, Microsoft is on record as saying that it will grant the Trump administration secret access to all the data in its cloud:
forbes.com/sites/emmawoollacot…
31/
So - as Signal's Meredith Whittaker and Udbhav Tiwari put it in their incredible 39C3 talk last week in Hamburg - Microsoft is about to abolish the very *idea* of privacy for *any* data on personal and corporate computers, in order to ship AI agents that cannot *ever* work:
youtube.com/watch?v=0ANECpNdt-…
32/
Meanwhile, a Microsoft exec got into trouble last December when he posted to LinkedIn announcing his intention to have AI rewrite *all* of Microsoft's code. Refactoring Microsoft's codebase makes lots of sense. Microsoft - like British Airways and other legacy firms - has lots of very old code that represents unsustainable tech debt. But using AI to rewrite that code is a way to *start* with tech debt that will only accumulate as time goes by:
windowslatest.com/2025/12/24/m…
33/
Now, some of you reading this have heard software engineers extolling the incredible value of using a chatbot to write code for them. Some of you *are* software engineers who have found chatbots incredibly useful in writing code for you. This is a common AI paradox: why do some people who use AI find it really helpful, while others loathe it? Is it that the people who don't like AI are "bad at AI?" Is it that the AI fans are lazy and don't care about the quality of their work?
34/
There's doubtless some of both going on, but even if you teach everyone to be an AI expert, and cull everyone who doesn't take pride in their work out of the sample, the paradox will still remain. The true solution to the AI paradox lies in automation theory, and the concept of "centaurs" and "reverse centaurs":
pluralistic.net/2025/09/11/vul…
35/
In automation theory, a "centaur" is a person, assisted by a machine. A "reverse centaur" is a person conscripted to *assist a machine*. If you're a software engineer who uses AI to write routine code that you have the time and experience to validate, deploying your Fingerspitzengefühl and process knowledge to ensure that it's fit for purpose, it's easy to see why you might find using AI (when you choose to, in ways you choose to, at a pace you choose to go at) to be useful.
36/
But if you're a software engineer who's been ordered to produce code at 10x, or 100x, or 10,000x your previous rate, and the only way to do that is via AI, and there is no human way that you could possibly review that code and ensure that it will not break on first contact with the world, you'll hate it (you'll hate it even more if you've been turned into the AI's accountability sink, personally on the hook for the AI's mistakes):
pluralistic.net/2025/05/27/ran…
37/
There's another way in which software engineers find AI-generated code to be incredibly helpful: when that code is *isolated*. If you're doing a single project - say, converting one batch of files to another format, just once - you don't have to worry about downstream, upstream or adjacent processes. There aren't any. You're writing code to do something once, without interacting with any other systems.
38/
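Something in this vein, say (a hypothetical one-shot batch conversion - the directory names are made up, and nothing upstream or downstream depends on it):

```python
# Throwaway script: convert a folder of CSV exports to JSON, once, then discard.
import csv
import json
from pathlib import Path

SRC = Path("exports")      # assumed input directory of .csv files
DST = Path("converted")    # assumed output directory
DST.mkdir(exist_ok=True)

for csv_path in SRC.glob("*.csv"):
    with csv_path.open(newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    out_path = DST / (csv_path.stem + ".json")
    out_path.write_text(json.dumps(rows, indent=2), encoding="utf-8")
    print(f"wrote {out_path} ({len(rows)} rows)")
```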
A *lot* of coding is this kind of utility project. It's tedious, thankless, and ripe for automation. Lots of personal projects fall into this bucket, and of course, by definition, a personal project is a centaur project. No one forces you to use AI in a personal project - it's always your choice how and when you make personal use of any tool.
39/
But the fact that software engineers can sometimes make their work better with AI doesn't invalidate the fact that code is a liability, not an asset, and that AI code represents liability production at scale.
In the story of technological unemployment, there's the idea that new technology creates new jobs even as it makes old ones obsolete: for every blacksmith put out of work by the automobile, there's a job waiting as a mechanic.
40/
In the years since the AI bubble began inflating, we've heard lots of versions of this: AI would create jobs for "prompt engineers" - or even create jobs that we can't imagine, because they won't exist until AI has changed the world beyond recognition.
I wouldn't bank on getting work in a fanciful trade that literally can't be imagined because our consciousnesses haven't yet been so altered by AI that they've acquired the capacity to conceive of these new modes of work.
41/
But if you *are* looking for a job that AI will definitely create, by the millions, I have a suggestion: digital asbestos removal.
For if AI code - written at 10,000 times the speed of any human coder, designed to work well, but not to fail gracefully - is the digital asbestos we're filling our walls with, then our descendants will spend generations digging that asbestos out of the walls.
42/
There will be plenty of work fixing the things that we broke thanks to the most dangerous AI psychosis of all - the hallucinatory belief that "writing code" is the same thing as "software engineering." At the rate we're going, we'll have full employment for generations of asbestos removers.
43/
I'm coming to Colorado! Catch me in #Denver on Jan 22 at The Tattered Cover:
eventbrite.com/e/cory-doctorow…
And in #ColoradoSprings from Jan 23-25, where I'm the Guest of Honor at COSine:
firstfridayfandom.org/cosine/
Then I'll be in #Ottawa on Jan 28 at Perfect Books:
instagram.com/p/DS2nGiHiNUh/
And in #Toronto with Tim Wu on Jan 30:
nowtoronto.com/event/cory-doct…
44/
Image:
Cryteria (modified)
commons.wikimedia.org/wiki/Fil…
CC BY 3.0
creativecommons.org/licenses/b…
eof/
RealGene ☣️
in reply to Cory Doctorow
It should be called "digital asbestos *remediation*".
Just like real-world asbestos, you can never be sure it's completely gone after a cleanup; the best you can hope for is that you've reduced the hazard to the point it's not an immediate threat.
David Nash
in reply to Cory Doctorow
> Software engineering requires a very wide "context window," the thing that AI does not, and cannot have.
The type of work I'm doing now (data engineering for a large organization) is full of the sort of software development you describe here. Whatever code I write has to cope with data coming in, from multiple not-mutually-friendly parts of the company, and it has to at least try to produce consistently comprehensible (and reasonably updateable) data for downstream users or processors. A huge part of even beginning to make that possible is understanding, in detail, what those parts of the company actually want or need. That's generally the most challenging part of my day job. The code is the easy part, and being able to puke out more code in less time rarely, if ever, solves the hard parts.
🔶Mark Nicoll 3.5%🏴🇬🇧🇪🇺🇺🇳
in reply to Bruce Heerssen
Not even the center point of the complete postcode, but often just the center of the first half, which can cover a pretty wide area.
Allpoints
@angusm "Tech debt as a service" is my new all-time favorite description of vibe coding.
Thank you.
#Coding #DevOps #developers
Paul_IPv6
in reply to Cory Doctorow
Much like how most pharmacists feel about drugs (useful if absolutely necessary, but the fewer of them you take the better), coding is about having the least possible code to do what you want to accomplish.
In both cases, the more unnecessary stuff you have, the more likely it is that odd interactions and unpredictability will overwhelm the usefulness.
SpaceLifeForm
in reply to Cory Doctorow
AI SLOC.
SLOC is the new productivity metric. Same as always. /s
mike805
in reply to Cory Doctorow
The root cause of this problem is the stock market. If we are ever going to fix what you call enshittification, the stock market is going to have to be drastically reformed.
Companies are not being managed to maximize their operating profit. We'd be a whole lot better off if they were. They are being managed to maximize stock price, and the problem with that is you can sell and not own the consequences.
Marty Heyman at COBOLworx
in reply to Cory Doctorow
Of course, Cory, that makes vast quantities of "mission" critical lines of code "liabilities" - hundreds of billions of lines of production #COBOL code, for example.
But that's a glib misdirection. Lines of code, human- or AI-generated, that are in production are both a liability and a responsibility. So whether you promote slop or carefully engineered code, you own it and it owns you.
Electropict
Managers are addicted to using the available tools rather than those they need, and then, because they have a tool, they refuse to think about what they really need. Measuring software quality by lines of code is the same as looking for your keys under a lamppost rather than where you dropped them. 🤷
Bill, organizer of stuff
in reply to Electropict
@electropict @phaedr0s Not addicted to – limited to. Executives are addicted to it, but managers suffer the consequences.
(source: Me. An ex-manager.)
FallsMom 🌻
A friend who has a concrete-pouring biz says, "There are two kinds of concrete: the kind that has cracked, and the kind that hasn't cracked yet."
Coach Pāṇini ®
in reply to FallsMom 🌻
Underwriters successfully selling cyber insurance: "There are only two kinds of companies: those that have been hacked, and those that don't know it yet."
Khleedril
in reply to Ben Cox
@ben @FallsMom @phaedr0s Also, there are two kinds of people:
1) Those that can extrapolate from incomplete data,