Yup, the hard part isn’t writing code.
I’m always a bit cautious of the argument that the barriers are a good thing. That easily slips into harmful gatekeeping if we aren’t careful. It’s not good when programming is inaccessible or unwelcoming.
What •is• good is that writing code slows you down and (if you’re good) makes you •think• about what the heck you’re doing — the work @saraislet is talking about — with a depth and detail that no amount of chin-stroking and up-front design work can match. Skipping that work, however you skip it, is a false gain.
infosec.exchange/@saraislet/11…
Insecurity Princess 🌈💖🔥 (@saraislet@infosec.exchange)
One of the problems with vibe coding is that the hardest part of software engineering is not writing the code, rather it's *choosing* what to code, and designing the system (and, later on, maintaining the code/operations/etc) The barriers and invest…
Paul Cantrell
in reply to Paul Cantrell
Code is a confusing engineering object for our human brains. It is a bit like a bridge or a kitchen utensil: it’s an object without a mind, it has a function, it can fail, people do unexpected things with it. But it can also feel a bit like a person given a task: it has behavior, it •decides•, it •acts•, it •causes•.
(All arguably true of a bridge too, but most of us don’t think of bridges that way!)
The lay understanding of code leans heavily on anthropomorphism: programs as little homunculi with agency.
2/
Paul Cantrell
in reply to Paul Cantrell
Where that understanding goes wrong is that humans have perception and experience and common sense. And yes, people are foolish and fallible — but ultimately humans are the adaptable element of complex systems[1], and when we design processes involving humans, we always, always lean on that adaptability.
If we say “walk out the door,” humans generally will not walk face-first into the wall just because the door is behind it.
Code will — unless you tell it not to.
[1] how.complexsystems.fail/#12
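A minimal sketch of that point, in TypeScript (names invented here, not anyone’s real code): the literal-minded version does exactly what it was told, and the common sense has to be spelled out by hand.

```typescript
// Hypothetical illustration: code walks face-first into the wall
// unless the "common sense" is written in explicitly.

type Position = { x: number };

// Literal-minded version: steps toward the target no matter what is in the way.
function walkToward(pos: Position, target: number): Position {
  return { x: pos.x + Math.sign(target - pos.x) };
}

// The adaptability a human supplies for free has to be coded by hand.
function walkTowardAvoidingWall(pos: Position, target: number, wallAt: number): Position {
  const next = pos.x + Math.sign(target - pos.x);
  return next === wallAt ? pos : { x: next }; // stop rather than hit the wall
}
```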
3/
Paul Cantrell
in reply to Paul Cantrell
So much of the work we do in software is about asking not just what •does• happen when the developer runs their own code, but what •could• happen when the code is running in the wild, out from under the watchful eye of its author.
That’s really, really hard. It’s a large portion of what makes development time-consuming and labor-intensive.
And it’s something you can only do if you actually understand what the code •means• — both to the computers and to the humans.
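As a sketch of that gap (TypeScript, invented example): the first version is fine every time its author runs it with tidy test data; the second has to spell out what •could• happen once other people’s data arrives.

```typescript
// Hypothetical example: "what does happen" on the author's machine
// versus "what could happen" in the wild.

// Happy path: works every time the author feeds it their own tidy input.
function averageScore(csvLine: string): number {
  const scores = csvLine.split(",").map(Number);
  return scores.reduce((a, b) => a + b, 0) / scores.length;
}
// averageScore("")      -> 0   (silently wrong: Number("") is 0)
// averageScore("12,,9") -> 7   (the blank field counts as 0)

// Wild-input version: the author has to decide, explicitly, what every
// malformed case should mean.
function averageScoreDefensive(csvLine: string): number | undefined {
  const scores = csvLine
    .split(",")
    .map((s) => s.trim())
    .filter((s) => s.length > 0)        // "" would silently become 0
    .map(Number)
    .filter((n) => Number.isFinite(n)); // "abc" becomes NaN; drop it
  return scores.length > 0
    ? scores.reduce((a, b) => a + b, 0) / scores.length
    : undefined; // "no valid scores" instead of a quiet NaN or 0
}
```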
4/
Paul Cantrell
in reply to Paul Cantrell
A lot of the design work that goes into programming languages and tools is about prompting developers to •think about meaning•: tests, types, scope, compile errors, runtime errors — all about •preventing code from running• in the presence of an expectation/reality mismatch.
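A tiny sketch of the kind of mismatch those mechanisms exist to catch (TypeScript, invented names), where the wrong expectation is stopped before the code ever runs:

```typescript
// Hypothetical example: types and tests both refuse to let code run
// when the author's expectation and reality disagree.
import assert from "node:assert";

type Cents = number; // the expectation: prices are integer cents

function applyDiscount(price: Cents, percentOff: number): Cents {
  return Math.round(price * (1 - percentOff / 100));
}

// Compile error: the author expected a number, this call hands it a string.
// The mismatch is caught before the code runs at all.
// applyDiscount("19.99", 10);

// Test failure: if the meaning of applyDiscount drifts, this throws and the
// build stops, again before the code reaches anyone else.
assert.strictEqual(applyDiscount(1999, 10), 1799);
```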
I’m always on high alert for tools that promise to speed development by letting developers skip the thinking.
5/
Paul Will Gamble
in reply to Paul Cantrell
This AI vibe coding (“kids should not go into coding as a career”) era reminds me of that guidance counsellor in high school (circa 1995) warning us to stay away from the computer industry at large b/c he read an article about Moore’s law (I assume) and figured they can only make these darn computers run so fast and then that’s it… this was before any true adoption of the Internet.
Hands-on coding experience, as you’re describing, would dispense with all the theory.
Ref: I’m still gainfully employed fixing mainframe code 🙃
Paul Cantrell
in reply to Paul Will Gamble
@paulywill
I remember a similar brief panic about offshore outsourcing. All the programming jobs were going to disappear to other countries…until they didn’t. Prediction is hard, especially of the future.
Paul Cantrell
in reply to Paul Cantrell
That (letting developers skip the thinking) is not necessarily the only way to use machine learning to assist coding. It is, however, one of the “magic diet pills” promises at the heart of the current hype bubble.
Whatever the dev tools of the future look like, until we have Lieutenant-Commander Data, difficult human thought will be at the heart of writing code.
6/
Paul Cantrell
in reply to Paul Cantrell
In a saner environment, we’d be having a reasonable conversation about that: in what ways, if any, can a machine that repeats contextual patterns with no sense of meaning augment humans thinking through tricky things? In what ways can it mislead? When, if ever, is it worth the tradeoffs? the resource costs? etc., etc.
Right now, the off-the-charts money and hype make that reasonable conversation impossible except perhaps in hushed corners. (Please do not have that argument in my replies. I am tired.)
7/