What I see - and have seen since I started doing this 30+ years ago - is that the date is _always_ more important than the actual deliverable. Always. Meeting "the date" is the only thing that's tracked (and it also never happens). It's even justified through vague analogies like Joel Spolsky's admonition that "you wouldn't buy a pair of jeans without knowing how much they cost", without ever taking a slightly deeper dive into how developing software is different from selling a pair of jeans.
All of the collaboration artifice that the author is referring to seems to me to always be a futile attempt to meet "the date". That software development might itself be _inherently_ unpredictable is never even considered, even though there are a lot of reasons to suggest that it is: by definition, the software you're developing has never been developed before, or else you could just use the thing that already exists.
I had a glimmer of hope in 2001 when the agile manifesto was published - everything about it seemed to me to read "software development activities can't be coordinated the way a wedding banquet can, but you can at least make sure that everything is tracking toward a shared understanding". I guess I shouldn't have been surprised when "Agile" became "tell me exactly what you're going to do and how long each step will take" almost from the instant of its inception.
IMO (also 30 years in the biz), it's rarely the date; that's #2. It's the budget.
They'll forgive you if you're slightly late, they'll hate you forever if you ask for more money.
Agile works really well if you have a good product owner who has secured appropriate budget for the level of uncertainty in the endeavor and can make decisions without being overridden by extrinsic forces. Everything else is negotiable.
To me, the _real_ thing that matters isn't quite date or budget, but something that somehow acts as an umbrella to both of them: the promise. When you promise to deliver something by a date, or within a budget, it's very clear whether you met your promise or not. However, when it comes to functionality, there is more of a grey area: you can start to argue that something _mostly_ works, that some bugs are always inherent, that this functionality isn't really needed because the problem can be fixed in an operational way, that the requirements have changed, or that it was just a nice-to-have... but money and time don't have these grey areas.
The natural extension to Spolsky’s quote is: Unless someone else is paying for the jeans.
I think the smaller the organization, the more likely that a software project has real stakeholders. In bigger, more mature organizations, the experienced players have arranged their affairs so that their career progress doesn't depend on delivery of software: late, early, or ever. For instance, I work on the "hardware" side of technology development, and I tailor my annual performance review goals so that a deliverable is satisfied when I can demo it with code that I've written myself.
> the date is _always_ more important than the actual deliverable. Always.
Hah! You just gave me an idea for a new methodology. Date-bound delivery.
- The business tells you what they want, as they do
- The business tells you when they want it, as they do
- The team does not say how long it will take. Instead, they say what they think they can deliver in the time allotted.
- As the date nears, more edge features get trimmed
- As the date arrives, something is always ready to deliver, no matter how minuscule
Such a methodology would ensure delivery, but not necessarily the contents of that delivery. Post mortems would no longer discuss why something took so long, and instead would focus on why features were cut.
If, as you say, the date is always more important, wouldn't such a methodology be worth trying?
that's really what agile was supposed to be. at least in the places where I saw it succeed.
every week, something is delivered, and is demoable, with approved tests from the business. That thing represents the most important thing to the business relative to the risk prioritization from engineering & usability prioritization from design.
every week, priorities can adjust, etc. and the cycle continues. hitting the actual 'release date' becomes much more knowable when you see the tangible date-driven progress on a regular cadence.
Yes, but expanded to the full deadline instead of only the short iterations.
The business does not care about week long deadlines. They need something on May 23 so they can achieve _______.
My understanding of Scrum (not representative of all agile, I know) is that the velocity is supposed to be tracked and used for better predictions. In my experience this takes a very dedicated core of people who are intent on making it happen. In other words, usually it doesn't happen.
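For what it's worth, the arithmetic behind velocity-based forecasting is trivial; the hard part is the discipline to collect honest numbers. A minimal sketch (the function name and the story-point figures below are made up for illustration):

```python
import math

def forecast_sprints(remaining_points, points_per_past_sprint):
    """Estimate sprints remaining from scope left and historical velocity."""
    # Velocity = average story points completed per sprint so far.
    velocity = sum(points_per_past_sprint) / len(points_per_past_sprint)
    # Round up: a partial sprint still occupies a full sprint on the calendar.
    return math.ceil(remaining_points / velocity)

# Example: 60 points of backlog left; the last four sprints delivered
# 8, 12, 10, and 10 points respectively (average velocity = 10).
print(forecast_sprints(60, [8, 12, 10, 10]))  # -> 6
```

The model is only as good as its inputs, which is exactly why it takes that dedicated core of people to make it happen.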
But date-bound delivery is already our default mode of operation. We just don't like to admit it. We are going to deliver something on this date; we just don't know what, yet.
However the point of the weekly cadence is that the business does care about adjusting scope and priority towards hitting that deadline on May 23, so that they know what they're going to get on May 23 and have the power to adjust it.
Especially if the goal of what is delivered on that date is not clearly defined. It almost never is.
Most projects can be summed up as "give me $X, I'll come back in 6 months, and ask for more time and money", or "here you go"... "that's not what I wanted".
It's a key risk mitigation toward a hard date to know every week if you're still getting what you wanted.
Velocity is overblown as a metric. It's one metric among many that can signal a few things (e.g. quality problems because bug fixes are overtaking features) but isn't as much of a lever as some say.
Yep I agree. Iterations are still good, demos are still good, ever-evolving scope discussions are still good, regardless of the overarching methodology.
That's because LLMs don't actually think, they pattern-match. Since all the existing estimations out there are made assuming that a human is going to perform the task, the estimation that the LLM provides has the same inherent assumption. The LLM doesn't have a corpus of LLM-led estimations so it cannot take that into account.
IMO, "ish". You can reliably and repeatedly produce good teams _if_ you reliably and repeatedly invest in your people.
IMO, what's really happening is that small, effective teams aren't _fungible_ - you can't just swap people around without breaking the magic in a team, and you can't just move a team around an organization without similarly breaking the magic (although the latter _is_ way more possible).
IMO, it's sort of an organizational version of "context switching". It takes time for a team to gel and get up to speed. If you're swapping out team members, you break that cohesion. If you move around teams, you (somewhat) reset that "getting ramped up" process.
I feel a bit wacky even saying this, but I just started re-reading Team Topologies last week because it's starting to feel like the whole orchestration pattern only works reliably when roles and structure are clearly defined.
I wonder if that made it into the training set intentionally, or just as an unexpected side effect of stealing every character of text available on the internet with absolutely no curation?
It is true that on an individual level you get more work done if you collaborate less. But very often you will have solved the wrong problem. Collaboration often prevents that from happening, because different points of view will make it clear before you try to solve the problem.
Strong words. I wonder if the author has PTSD from poorly managed teams and has never had the fortune to work in a high-performance, well-managed collaborative environment. I agree these are rare compared to the other kind, but they exist. Groups of people can produce more than lone wolves. One person didn't build the pyramids, the Linux kernel, or Amazon Web Services. Even when responsibility for a top-level domain rests with a single person, you still have to coordinate the work of the people building the individual components.
One of the features of my work, these days, is that I work alone. I worked in [pretty high-functioning] teams, for most of my career.
Teams are how you do big stuff. I’m really good at what I do, but I’ve been forced to reduce my scope, working alone. I do much smaller projects, than our team used to do.
But the killer in teams is communication overhead, and much of that is imposed by management trying to get visibility. If the team is good, they often communicate fine internally.
Most of the examples he gave are tools of management seeking visibility.
But it’s also vital for management to have visibility. A team can’t just be a “black box,” but a really good team can have a lot of autonomy and agency.
You need good teams, and good managers. If you don’t have both, it’s likely to be less-than-optimal.
They could review PRs and commits and specs to get visibility and reduce comms overhead, if they had the skills and time.
The non-technical manager also finds it very convenient to make technical people spend their time translating things. But no one ever asks the manager to learn new skills the way they make developers do it.
Standups should eliminate almost all other meetings engineers need to attend. Except to go deeper on questions that came up in standup that cannot be instantly resolved.
there should be only 3 regular meetings in an agile engineering team
- weekly iteration planning (1-2 hours max)
- daily standup (15 mins max)
- weekly demo & retro (1-2 hours max)
literally everything else is work off the kanban board or backlog.
in my teams everyone was told to decline all meetings unless it explicitly led to the completion of a weekly planned story/task. this way all meetings for the team have a clear agenda and end in mind.
for mandatory external meetings & running interference with external parties, there are ways to insulate the majority of the team from that.
Is that three kinds of regular meetings? Because I count 8 meetings (and four kinds, as I don't think I've ever had demo and retro combined due to different groups of people being in both).
I will allow one more meeting to start a new sprint and end the previous one. Everyone should have prepared ahead of time to report on all their sprint items and whether they were completed, if not why not, and to present the work they will be doing in the next sprint.
If the Scrum Master or whatever their title schedules any other repeating process meetings, fire them.
Strong agree. When I started managing there was very little oversight. It wasn’t perfect and we went a bit astray, and we also did phenomenal work and had everyone on the team deeply engaged and moving with autonomy.
On my second team, the visibility theater took over, upper management set and reset and reset and reset our direction, and nobody was happy. In retrospect, I should have said no immediately. Trusting and empowering your people is hard to beat.
Communication overhead is a quadratic function: a team of n people has n(n-1)/2 pairwise channels, so the time it takes to keep everyone informed grows as n^2.
That's why the most effective teams are wolf packs - roughly 6-10 highly performant members where communication overhead is still low enough that it barely matters, but have enough people to be way more productive than an individual.
Obviously there's a minimum level of competence you need for this to work. The smaller the team, the fewer freeloaders are tolerated.
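The quadratic claim is just pair-counting; a quick sketch makes the wolf-pack sweet spot visible (the team sizes here are purely illustrative):

```python
def channels(n):
    """Pairwise communication channels in a team of n people: n choose 2."""
    return n * (n - 1) // 2

# Headcount grows linearly while channels grow quadratically.
for n in (3, 6, 10, 20, 50):
    print(f"{n:2d} people -> {channels(n):4d} channels")
# 3 -> 3, 6 -> 15, 10 -> 45, 20 -> 190, 50 -> 1225
```

At 6-10 people the overhead is still a few dozen channels; at 50 it's over a thousand, which is why larger groups inevitably split into sub-teams with defined interfaces.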
It's a provocative title, but I think this section better captures his scope of argument - "Collaboration-as-ideology has made ownership and responsibility feel antisocial, which is a hell of a thing, given that ownership is the only mechanism that gets anything across the finish line.", as well as "But there’s a huge difference between communication and collaboration as infrastructure to support individual, high-agency ownership, and communication and collaboration as the primary activity of an organisation".
I think the author has identified that most organizations both fail at effective collaboration, and also use collaboration to paper over their failures.
I think the author maybe over-corrects by leaning on the idea that "only small teams actually get stuff done", and honestly I don't think anyone should be using SLA Marshall/Men Against Fire as an analogy for like... office work (if nothing else, even if you take his words at face value, the percentage of US infantry who fired their rifles went up from 15-25% in WW2 to ~50% in Korea due to training improvements), but I can get behind the idea that a lot of organizations are set up to diffuse responsibility.
I also do think it's interesting to think about building the Pyramids. For the vast majority of people involved... I don't think modern audiences would call their work relationship or style "collaborative". Usually we use "collaborative" in opposition (at different times) to "working alone", "working with strict boundaries", and "being highly directed in what to do". Being on a work gang, or even being a team foreman, is very much not "working alone", but those were also likely highly directed jobs (you must bring this specific stone to this specific location by this time) with strict boundaries.
Yeah, I think the author strays a bit away from the title.
The author says, "The collaboration industry has spent a fortune obscuring a dirty truth: most complex, high-quality work is done by individuals or very small groups operating with clear authority and sharp accountability" which means collaboration can work... in the right environment and with the right people. I work in R&D and I could not imagine not working in a collaborative environment. It's not reasonable to have expertise at everything and it's understood that things have to get done no matter whose name is on the ticket/story.
I also agree with your calling out the Men Against Fire example. That's not a collaboration issue, that's a training issue (amongst other things). And that problem went away, as you said.
> By 1946, the US Army had accepted Marshall’s conclusions, and the Human Resources Research Office of the US Army subsequently pioneered a revolution in combat training which eventually replaced firing at ‘bulls eye’ targets with deeply ingrained ‘conditioning’ using realistic, man-shaped ‘pop-up’ targets that fall when hit. Psychologists know that this kind of powerful ‘operant conditioning’ is the only technique which will reliably influence the primitive, mid-brain processing of a frightened human being. Fire drills condition terrified school children to respond properly during a fire. Conditioning in flight simulators enables frightened pilots to respond reflexively to emergency situations. And similar application and perfection of basic conditioning techniques increased the rate of fire to approximately 55 percent in Korea and around 95 percent in Vietnam.
It was also probably never true. The author handwaves away 'disagreement about his methods', but SLA Marshall was also simply a liar. He claimed interviews he never did and lied about his own combat experience and the circumstances of his own commission.
Agreed. I came to the comments to say something similar. I think the author raises some interesting points worth consideration, but their perspective is so incredibly cynical. He mentioned a small team that made the Apollo computer program. Well, it took an awful lot more than a computer program to get to the moon. I don't think anybody would argue that there aren't people who don't pull their weight out there, but there is so much evidence that people working together actually works that it makes you wonder who hurt the author so much.
There's also a lot of evidence it doesn't work. It's not either/or.
This piece is more of a whine about a certain kind of office culture, which the author - unreasonably - generalises to collaboration as a whole.
There's likely a lot of money to be made by identifying and defining good vs bad collaborative cultures.
Both are real. But a lot of "good" practices are more cargo culty than genuinely productive, and the managers who really do make it work seem to get there more by talent and innate skill than learned effort.
I fail to grasp the basis of folks knee-jerk dismissal of just about anything that strikes them as "cynical". Like, what world do you live in that cynicism isn't a signal of clear vision?
> Groups of people can produce more than lone wolves.
It's not a linear scale. A lone wolf can't produce the latest Assassin's Creed game. A committee can't produce Stardew Valley or Balatro. They're different capabilities, not a simple matter of more/less.
I can't say anything about how the Pyramids or AWS were built. But the Linux kernel's maintenance is full of responsibilities assigned to individual people.
yes, it seems that the author is against the typical corporate bullshit faux collab (where people are overloaded with distractions, and the whole culture is about "managing expectations", managing up, showing impact), not against delegation, supervision, review, and a few well positioned veto points
At first, I also thought that rejecting collaboration excludes any kind of teamwork, but then I noticed the quotation marks - so they're apparently only rejecting quote-unquote-collaboration (as in "collaboration theatre": endless calls with no tangible outcome, wanting to involve everyone in decisions etc.), not actual collaboration (which is also consistent with what the article itself says).
Depends on the problem being solved. And how frequently the core prob changes. Cuz nothing is static in an ever changing universe.
What organization, skills, leadership is required to explore a jungle for gold is very different from what organization, skills and leadership is required to run a gold mine.
So we get explore-exploit tradeoffs, satisficing vs optimizing choices etc.
Yeah, the sheer joy I've gotten from being part of a few collaborative teams in my career was amazing. It was like we all got smarter by working together.
I think the author might have a lot of bad collaboration experience from working with teams that have a low level of competence and agency, especially in corporate settings. This resonates with me (as of a few months ago).
Being laid off from a startup and moving to a corporation gave me perspective. The first year, working with the team went really well; we got a lot of stuff done and the business was very happy.
Then came the Agile Coaches telling us to "Collaborate", while disguising their own agenda as a need (one was also a PO for another squad). So: workshops on Collaboration, an explicit expectation that the PM has all authority and controls the PO, for 8 freaking months, just to get a competent team to work with a junior team that had no agency nor even a willingness to be mentored or to do anything. So this incidentally aligns perfectly.
Corporations always manage to hire incompetent people, never fire them, and let others over-compensate for their failures. So yeah, it's not really obvious, but it's there.
I believe good collaboration can happen, but only when people actually let go of their egos and start listening and actually doing the work.
It's interesting that the author does not even consider the impact of incentives on performance. As Charlie Munger famously said, "Show me the incentives, and I'll show you the outcomes." It is true that collaboration becomes increasingly difficult as the team grows in size, but collaboration is not the fundamental problem. To manage a large team, the real challenge is to design incentives that properly reward those who produce and perform, and penalize those who don't. People respond to incentives (yes, it is a tautology, and that is precisely the point).
What kind of incentives are possible in your average tech work environment? A raise? A bonus? Raises usually come with more responsibility. I'm not familiar with tech companies doing bonuses.
Starts with how you evaluate employees for bonuses and promotion. Do you evaluate people on the impact of what was delivered? How fast they delivered feature work? The quality level of what they delivered? How well they worked with others?
The answers to basic questions like that already start to shape behavior. If you pay zero attention to how people behave, and only look at the impact of what was delivered, you may promote people who optimize for their own work but make others miserable. If you don't properly weight quality, especially now with AI code gen, you'll promote people who move fast and break more things than is reasonable.
We can easily find examples of suboptimal behavior arising from poorly shaped reward incentives at companies. Empire building is one behavior that results from managers getting promoted based on headcount. Stack ranking can lead, and has led, to people limiting collaboration with peers, because someone has to fail in order for someone else to get a favorable rating. Or people avoid riskier work because failure can put you in the hot seat.
Money is the sledgehammer of incentives. Above a reasonable amount of pay, it's overkill and creates lots of collateral problems. The really effective incentives are status-based and situational to the group dynamic.
Sure, you did a great job on that last project, we've added 8 hours of PTO for you. No, you can't take it any time soon, we're far too busy for you to take any time off
The issue is that systems don't account for the diversity in how people are motivated and what parts of systems they are sensitive to and how they are sensitive to them.
By default in the dominant culture, most systems come down to individual incentives for individual drive and shame dynamics for collective drive, and that covers a decent chunk of how people are motivated, but leaves out people who are motivated differently and actively harms people for whom these are paralyzing.
> The average knowledge worker maintains accounts across system after system, switching between applications hundreds of times per day. And they produce, in aggregate, a staggering amount of coordinated and collaborative activity that never actually becomes anything resembling output.
The problem with this is conflating *output* with *impact*. A team of lone wolves writing 1k LOC by the hour is good output but not necessarily good impact.
A team with higher coordination overhead and "structural support" will probably have lower output, but if it focuses on significantly higher leverage activities might just have a better impact. The key question is whether that impact is visible and understood (often not) and lots of businesses are bad at understanding leverage.
I see this lone wolf BS from lots of founder types who mistake their own grind for real performance, often missing their own blind spots. I don't disagree that the 80/20 rule comes up and that some people have an outsized contribution compared to others, but to say that collaboration is dead as a result is throwing the baby out with the bathwater.
This quote and the entire article could be extrapolated beyond the scope of an organization to highlight the importance of the notion of authorship in society as a whole:
> The collaboration industry has spent a fortune obscuring a dirty truth: most complex, high-quality work is done by individuals or very small groups operating with clear authority and sharp accountability, then rationalized into the language of teamwork afterward. Dostoevsky wrote _The Brothers Karamazov_ alone. The Apollo Guidance Computer came from a team at MIT small enough to have real ownership, hierarchical enough that Margaret Hamilton's name could go on the error-detection routines she personally designed.
Contrast this with the claims of “democratizing knowledge” and the image of a utopia where everyone contributes original work into a black box and expects no credit and no compensation in return (in fact, happily paying for the privilege of using it).
We, humans, like to have created something worthy of kudos. We pull the rope less hard when it's a collective effort than when the rope is ours alone.
As per the article, it describes an ideal organization where everyone likewise works towards achieving a similar goal.
The crucial difference highlighted is whether it involves one feeling responsible and recognised for their work on a particular part (even if it is destined to integrate with other parts, not unlike how a given human would with the rest of society).
Collaboration between us is the default (no one exists in isolation), but forcing a particular sense of collaboration onto people is a different thing.
That seems more like independent collaboration. Someone built something without getting 10 cooks to taste the broth. If it's good, then someone will identify it for its merits and build on top of it.
I was skeptical about the claim that 80% of soldiers refuse to fire their weapons, so I did a little reading and it seems like the original source has been pretty much debunked. This 2011 article sums it up: https://scholars.wlu.ca/cmh/vol20/iss4/4/ but it's been doubted for decades.
I doubt whether Marshall was referring to soldiers in logistical roles when he made his claim about only 20-25% of soldiers firing their weapons, but I do wonder whether other people are getting confused by those numbers. About twenty years ago I looked up the "tooth-to-tail" ratio for various branches of the U.S. armed forces, and found anywhere from a 1:10 ratio for the army (10 soldiers in support roles not expected to see combat vs. 1 soldier on the front lines who would be expected to need to fire his weapon), to a 1:25 ratio for the air force (which had, naturally, a lot more support personnel, such as mechanics and so on, who would spend their whole military career in hangars or on bases without ever flying a single plane). That's anywhere from 10% to just 4% of military personnel, depending on branch, who would be expected to fire at the enemy; the only time support personnel would be engaged in combat is if something had gone badly wrong militarily and their supply lines were being attacked.
So while the article you linked isn't confused on the subject, and I doubt Marshall was mixing support personnel in with front-line soldiers in his numbers, I do wonder whether there are people who confuse those two numbers: the number of soldiers, sailors, coasties, airmen, or marines who would never be in combat even during times of war, vs. the number who would actually be in combat and not fire.
(The article did address "what if the battle never came near where those particular soldiers were standing?", which was the other question I wondered about).
I agree. It seems impossible that it's referring to support staff in those numbers. I had heard of similar studies in the British Army in WW1, with similar results (training on man-shaped targets etc.) - surely the army would be unlikely to change tack based on a study with such an obviously flawed conclusion.
Not to mention the fact that this was a time of much more serious discipline issues. People were executed for desertion, and despite that many people did. There was also much malingering, up to and including literally shooting oneself in the foot. Is it so hard to believe that some people just hid when battles came?
I'd be very surprised to hear from the other person that by Vietnam they had gotten it up to 95%, though. My impression was that the most effective move away from this sort of thing was the move to a professional volunteer army, with no conscription.
On Killing further develops the idea [0] by looking at a wider set of battles across time and, crucially, finds that by adapting training methods, the kill rate went up beyond 90%. This then appears to come with higher rates of PTSD.
It's frustrating to pull more weight and take ownership when other people aren't. But what's legitimately soul-killing to an individual and deadly to an organization is the collective impulse to avoid giving those people credit when it's due. Most of those 20% out there pulling more than their weight just want some acknowledgement. Not giving them that is one way to quickly hollow out your company.
I've never cared about this, actually. For me, the camaraderie of the team is most important, and next comes the money. Acknowledgement from people who barely know what I do: I couldn't care less.
That's a lot of leaps: from riflemen, who obviously didn't want to die (did you expect them to rush Medal of Honor style?), to system features, to a model of office work? The whole essay is an incoherent mess written by one of those lonesome "no-bullshitters" who gets the job done but is so pulled down by modern-day bureaucracy that even his clairvoyance can't get through.
> Dostoevsky wrote _The Brothers Karamazov_ alone. The Apollo Guidance Computer came from a team at MIT small enough to have real ownership, hierarchical enough that Margaret Hamilton's name could go on the error-detection routines she personally designed
I have good news for you, my jaded friend! What is similar between those people and you? You're an individual! Therefore you could write another masterpiece yourself; you can be the next Notch, the next copyparty guy, the next Stardew Valley guy, joining a long list of creations by actually high-performing individuals, not some complainer who is oh so encumbered by stupid social dancing.
> A lot leaps from riflemen, who obviously didn’t want to die
Yeah but you'd think not dying involves killing those who want to kill you, or at least shooting at them! Isn't it super interesting to learn that 80% of riflemen don't ever shoot?
In a gunfight, you usually have to expose yourself at least a little bit in order to aim and fire. And let's say that you know an enemy soldier is around some corner, unaware, and you can pop out and shoot them. If there is another soldier aiming at your position, unbeknownst to you, you are dead.
a) another comment in the thread disproved the claim
b) even if it was remotely true, context matters. Refusing to shoot someone point blank because of reasons is one thing, refusing to go against Tiger 2 is another.
Yes, but that's also why the claim isn't true and has been criticized for years. It is so much more instinctive to simply pull the trigger even in a panic than sit there and do nothing.
You seem to ignore all the mountains of evidence that the sense of responsibility drops in groups. The larger the group, the bigger the drop. This is not news, nor is it nonsense.
> Every project now seems to carry more coordination overhead than execution time, and when it fails the postmortem just recommends more collaboration...
Or it gets stuck in code review cause one colleague likes nitpicking everything endlessly, so you’re stuck changing working code for multiple days.
Or they have questions and want to spend 2-4 hours in a meeting about design and how to do development "better", bonus points for not writing anything down for future reference, then expecting you'll keep a bunch of rules in mind. No ADRs, no getting-started guides, no docs about how to do deliveries, probably not even a proper README.md, or versioned run profiles, or basic instructions on how to get a local DB working (worst case, everyone uses the same shared instance).
Even more points for not even having retrospectives and never looking at stuff critically - why do people keep creating layers upon layers of abstractions and not care about the ideas behind YAGNI/KISS? On top of that, no actual tooling to check things (e.g. code style, but also tools to check architectural stuff, and obviously no codegen to deal with the overly abstracted BS).
It all depends on the project and team a lot. Some people have only had the fortune to work in locales and environments where stuff like that isn’t commonplace but rest assured, it can get BAD out there.
Working in a good team can be better than working alone, sure!
But working in a bad team is certainly worse than working alone.
Especially so when seniority is measured in years or nepotism and you’re told not to rock the boat and to shut up because “we’ve always done things this way”. I'm exaggerating a bit here, but I’m certain that plenty of people work in conditions not far removed from that.
The problem isn't "collaboration"; the problem is people who don't know what "collaboration" really means. Status reports, standups, committees, meetings with a large number of people, etc. do not make "collaboration" happen.
Collaboration isn't a process or a management technique -- it is a communication style. If you want collaboration, you can't take random people and use process to "make them collaborate" -- you need to build your team out of people who are collaborators.
Furthermore, collaboration is not at odds with accountability. Most of the highest performing collaborative teams I've ever worked on have people who are each individually highly accountable for their own contributions, and that's a critical part of what makes them a valuable collaborator.
> Collaboration isn't a process or a management technique -- it is a communication style. If you want collaboration, you can't take random people and use process to "make them collaborate" -- you need to build your team out of people who are collaborators.
Yes! I would add that IMO the communication style can be learned and there are great rewards for doing so.
I believe the rough statistic that 20% of people on a typical project are contributors. I don’t believe that it’s because the other 80% are losers. IME it’s because no serious effort has been made to include them, make sure they understand wtf is going on around them, and help them solve whatever is holding them back.
If you do this, a) it does work, and b) the need for small teams becomes apparent because the now-onboarded person can’t find anything that isn’t already being worked on, so they (with encouragement) start a new thing. And there are limits to people’s ability to understand what’s happening, especially if they’re inexperienced, and some people really don’t have the skills to contribute, but by and large, building bridges for people is still highly worth doing.
Sure. I trust people will follow this to its logical end, which is how bad our mental model of "work" is. The more I think about it, this may get to the heart of all of our (bad) economic theory: in a simple survival sense, there is no requirement whatsoever that "everybody works," or even contributes, but to have a functioning society, we do whatever it is we're doing now.
Something like UBI with extra steps.
And perhaps the bigger issue to get over, there perhaps ought not be a moral component to this, in a world where technology + a small number of people can easily take care of ALL actual needs.
A tip for the author, if they're here: those big flashing cursors at the top and bottom of the page make it exceedingly difficult for some of us to read the nearby text. Human eyes have evolved to follow the flashiest thing around, so they are continually pulled towards those flashes while trying to read. Mine struggle badly with the top and bottom sections, where those cursors are in frame, to the extent that I can't be bothered to do so.
The lead story was about the “useless” soldiers in a battle that was won. At a minimum, shouldn't one look for an example where the battle was lost? Most companies can only wish their outcomes were as good as the US's in World War II.
As someone who is being actively "encouraged" to be more collaborative with a few non technical political type managers merely for the appearance of it, this rings true. Collaboration is great if you don't have a clue and can coast on someone's coattails.
Anyone who has worked in any large organisation knows exactly what I’m talking about.
My current employment is the first time I haven’t seen this, the Pareto Principle.
The reason for this in biology is twofold:
1. Growth is distributed evenly as necessary to fill the containing system whether it is employment numbers or peas in a pod.
2. Resources are distributed to where they are demanded. Higher-productivity individuals will consume a disproportionate number of tasks to complete, as well as available resources. This difference is often statistically insignificant as a base difference, but after compounding, 20% of individuals account for 80% of outputs and inputs.
Human behavior shows the same compounding because the numerical growth dynamics are similar enough.
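This compounding claim is easy to check with a toy simulation (my own illustration, not from the parent comment; the rates and period count are arbitrary assumptions): give 100 individuals per-period growth rates that differ by at most 2%, and watch the output share of the top 20% drift from near-even toward the Pareto split.

```python
# Toy illustration: tiny per-period productivity differences compound
# into a Pareto-like split over enough periods.
def top_share(rates, periods, top_frac=0.2):
    """Share of total output held by the top `top_frac` of individuals."""
    outputs = sorted(r ** periods for r in rates)
    k = int(len(outputs) * top_frac)
    return sum(outputs[-k:]) / sum(outputs)

# 100 individuals whose per-period growth rates differ by at most 2%.
rates = [1.0 + 0.02 * i / 99 for i in range(100)]

print(round(top_share(rates, periods=1), 2))    # ~0.20: nearly even at the start
print(round(top_share(rates, periods=400), 2))  # close to 0.8: top 20% now produce most output
```

The base differences here are "statistically insignificant" in exactly the sense the parent describes, yet the compounded distribution ends up heavily skewed.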
This article is so true. "Collaboration" is how nothing ever gets done; we have this expression: "designed by committee"; we should also have "made by collaboration".
What's depressing is that it's like Fred Brooks's book never happened: most managers think the way to solve IT problems is just to throw more people / more money at it until it gets solved; and they're all surprised when it doesn't work, but try again the next time anyhow.
I think "design by committee" is a better target for criticism than collaboration in general.
If you get a bunch of people in a room and ask them for a design, one person is going to write the design while everyone else gets in the way. That's simply the nature of groups. The one person who writes it isn't even necessarily the best designer—they're just the one most willing to grab the whiteboard marker.
Conversely, if you ask one person to produce a preliminary design, they can leave, gather requirements, do research, produce a plan, and then convene everyone in a room to review it. Now all the abstract hypotheticals have been put to bed, the nebulous directionlessness has been replaced with a proposal, and the group can actually provide useful feedback and have a discussion that will inform the next draft of the design. And once the design is finished, everyone can easily work together to implement it as written. Collaboration is great, after someone has proposed a design.
That's part of what I like about the idea of Amazon's "culture of writing," though I've never worked in an environment like that in practice. Every idea needs to be preprocessed into an actionable memo before anyone tries to have a meeting about it.
Right. But more often than not, the problem that's being solved is "we have gotten money to throw at things", so the answer of throwing in many more people to busywork kind of makes sense.
That's before we even think about all the consultants and similar roles where busywork really is work. Then all the organizational or agile roles.
The fact that some product gets shipped and we still have customers is good, because that's what pays for it all, but that is just the foundation we all rest on. Almost like background noise.
Let's compare two projects that are collaborations:
Linux and Wayland.
Both are collaborative efforts; one has fairly effective and tyrannical leadership with the best interests of the community in mind. The other is led by committee with competing interests and goals, where they all have veto power.
Those same collaborators are reflected in the distro situation... Here is a group that also has some rather tyrannical leadership, but they have dependencies (see the software they run), and some of those folks are sick of the distro maintainers' nonsense and went to things like Flatpaks (see Bottles for an example).
> most managers think the way to solve IT problems is just to throw more people / more money
The reason you want to enforce a culture of collaboration is because that’s how you bubble up the smart people. The alternative is gatekeeping and cronyism.
Individual responsibility can just become a blame culture. I remember sitting near a team that worked like this - meetings with everyone trying to prove that some screw-up was actually due to someone else.
In such scenarios nobody wants to stick their neck out at all, everyone hates everyone else.
At a higher level the usual problem is with incentives being different from one team to another. If you want something done you have to start with the incentives rather than expect people to work against them and there does have to be leadership to break deadlocks.
> But there’s a huge difference between communication and collaboration as infrastructure to support individual, high-agency ownership, and communication and collaboration as the primary activity of an organisation.
That is a meaningless buzzword salad masquerading as a deep insight.
I think it's just explaining the difference between comms/collab being a supporting thing vs the only thing? It doesn't seem intended to be deep to me, but it's a little verbose.
News just in, too much of a good thing sometimes is bad. After the break, new study reveals: the sky is (sometimes) blue and the grass is (sometimes) green.
What it misses is that the 80% of soldiers who were not firing was still required. Not everyone has the same product, and someone’s product exists at an abstraction layer above the outcome and towards the organisation that builds it, as ugly and inefficient as it may be judged in comparison to an army of perfect contributors that does not exist.
I think the author is overfitting. Collaboration and ownership are actually not in tension. _Bad process_ and ownership definitely can be. You can still have a high performance, high accountability culture that is collaborative.
Thought-provoking essay. I can see how responsibility and ownership are important to help identify, motivate and reward the high achievers (and conversely, identify and get rid of the "dead wood"). But I can also see how collaboration and the dilution of responsibility and ownership helps better integrate junior members who might otherwise stay on the sidelines for longer than they should. There's also the issue of personnel turnover: what happens if the one person who is responsible for a major piece of a project leaves the company? A collaborative setting is more resilient to churn. There are trade-offs, and possibly a middle ground to be found.
An extremely good example of this is the fact that all major breakthroughs that pushed our civilisation forward were made by highly motivated and genius individuals.
I'm not saying teamwork is not needed, but it for sure should not be the #1 indicator for teams.
Vetting for character to skip dicks is still fine in my eyes.
When this article opened with a World War II story, I thought that the “collaboration” being discussed was people aiding the occupying forces. Sadly, it turned out to be less interesting than that.
Office lives matter! Do you know how much PTSD I have from waiting for my morning latte in our office coffeeshop while being late for standup? All of it!
Perhaps, but during the initial design stages it is the core job function to figure out how to improve the current workflow quality.
If folks don't clearly define the finish line, then a project will eventually run out of budget as scope creep turns it back into the same useless mess.
Proper restructuring usually means firing people that do not recognize they are in a business setting. The Big Brain fallacy is a common bias of people too blind to recognize what other professions bring to the table, who are ultimately unproductive in a large project.
Every manager should read Moneyball before building operational teams with HR, and avoid the trap of the costly Big Brain =3
"Moneyball: The Art of Winning an Unfair Game" (2003, Michael Lewis)
Collaboration has structure. The structure is the result of "the activity to create and maintain a shared understanding of a problem in order to solve it" - which is a definition of collaboration. I don't think collaboration requires a hierarchy any more than it requires a tool for groupwork.
Above N people it probably does, except in rare cases with an embarrassingly parallel, focused mission: hacker groups, searching for a missing person, etc.
I didn’t get anything out of Mythical Man Month, which I didn’t expect to anyway since it’s a management book. But for whatever reason it’s talked about as if it is a classic here. In part I think because it promotes the idea of the ten-x-er and how teams ought to be organized around supporting the ten-x-er.
> ...every unilateral decision gets read as a cultural violation and a signal that you aren’t a team player. Collaboration-as-ideology has made ownership and responsibility feel antisocial, which is a hell of a thing, given that ownership is the only mechanism that gets anything across the finish line.
This is definitely how it can feel sometimes, but it's simply not true. The problem really is just poor communication and big knowledge gaps.
I do understand that not all workplaces allow for enough discussion. Arrogant behavior from leadership is just them cracking under the pressure. You should know that you're never the only one noticing it.
Don't get dragged down with them. You can voice your concerns candidly with the right people who care the most about the outcomes and let the chips fall where they may, or you can suffer in silence like they expect you to. It's sad that this is the expectation because these conversations are just as much a part of "collaboration" as anything else. I expect there may be some defensive replies from certain types of people who feel threatened by this idea.
What you should never do is let this stuff get under your skin or take it personally. The fact of the matter is, teamwork is the nature of any business. When people go rogue, it only makes the problems worse. Everyone misses out on opportunities to grow and the project suffers from the lack of coordination to continue long term.
If they react that way, it's their loss. Expect better and you'll receive better. You'd be miserable sticking around anyway.
The only reason weak management is allowed to exist is because people never speak up. If you're the kind of person confident enough to speak up, you might be let go but will almost certainly find something better too.
If you're not that kind of person... well then fine go ahead and suffer. Is what it is.
Aren't many successful projects managed by a 'benevolent dictator'? That used to be a big deal around the Valley. Now everything is teamwork and collaboration.
I have a coworker that will take projects out from under you and do all the work themselves if you ask to collaborate with them. It happened twice, and in one of them he outright took all the credit and handed me peanuts. The person knows this, so they try to keep you involved by having you review their work, which is usually pointless because they’ve already done all the work and there are no alternatives or caveats to consider. I will never work with them again even though they do good work and are sharp, purely because they’re a control freak and it slows everyone down.
It is interesting to pick on an example like the Battle of the Bulge. To put those men, on both sides, in the field was an enormous effort of collaboration. We can say it was doomed from the beginning, in hindsight, but it was very dangerous at the time and took enormous efforts to disengage troops and redeploy them. Patton's redeployment must be one of the greatest organisational feats in history.
At the beginning of the Battle the weather was terrible, stopping the normal collaboration with the air force. When the weather cleared, collaboration restarted, and both arms could work together much more effectively than the army alone.
Can we start appreciating and respecting other people's professional experiences without dismissing and criticising them?
It's well written and brought to light a very interesting subject, e.g. "Marshall’s research showed that just 15-20% of riflemen in active combat positions ever fired their weapons".
A lot of process and management is about dealing with low performers - by which I don’t mean incompetent people but people lacking motivation, or with the wrong intuitions, etc. Our hiring process can’t reliably filter out low performers, and when they get in it’s difficult to fire them, so we invent ways to raise the bottom line through processes.
And FWIW I don’t think you can solve this by always hiring the “best” either, at least not beyond a certain team size.
Maybe "teamwork" is bullshit, but that's only one way to do collaboration. Specifically, it's the hierarchical way. Usually, this is referred to as "participation" or "corporation", while "collaboration" and "cooperation" are used to describe an anarchist approach.
> Collaborating means the failure belongs to the process.
This is the way that hierarchy fails to scale. The larger a hierarchy, the more "process" must exist to keep it together. Process in a hierarchy must be defined by superiors, and implemented by inferiors, so it is superiors who must own the failure of process, and inferiors who are blamed for it.
> The average knowledge worker maintains accounts across system after system, switching between applications hundreds of times per day.
This is the more serious problem that comes from hierarchy. Work done by a team must become its own isolated context: the project. Anything anyone hopes to be able to do with a computer must be monopolized somehow by an "application". Why? An application is the only practical goal that a project can have. Anything more would have too broad a scope to be manageable.
---
We don't need hierarchy. There is another way to do work, and despite making incredible tools for it, we have barely scratched the surface.
We have a decentralized internet, email, and git, so why do we keep making applications? An application is not only a reflection of the hierarchy that makes it; it's also a reflection of the environment. No matter how you contribute to the puzzle, your contribution must be a piece. How else could it fit with the rest?
Free software has been struggling with this dichotomy from the beginning, but it's only getting worse. Most of the systems we use are becoming ever more consolidated and impositional. Most people don't configure their system by editing each of the unique config files: they open the GNOME/XFCE/KDE "system settings", and expect it all to stay consistent. Most of what we actually do with computers is facilitated by one of two major web browser engine implementations. Want to make a new window manager? Get ready to build a feature-complete Wayland compositor (probably leveraging the bulk of another compositor's code). Sure, you can use shell utilities, but it's not like we are doing anything new or interesting with them: just transforming text with a careful emulation of a 50 year old environment.
---
There's no clear way to resolve this situation. Software is useless until it can fit as a piece in the puzzle. If we want a system that is not a puzzle, then wouldn't we have to throw away all the precious pieces?
Even so, I think we have really lost touch with the original magic of computing. All these isolated contexts are walls that stop us in our tracks. Formats, accounts, applications, frameworks, platforms... all dead ends. Maybe it's time we make a path that doesn't end?
>what are we actually producing and who is actually responsible for producing it?
I don't even think in this age it's a "collaboration" issue. We've seen years of mass layoffs at this point with little rhyme or reason. Sometimes being the "producer" saves you, but not always. Hard work isn't rewarded and leverage isn't necessarily respected anymore.
Your best bet these days is "collaborating" with someone high up who can shield you. Not because you're a producer, but because they like you. The illusion of meritocracy has completely collapsed (at least in large companies).
I think our processes are terrible as an industry. I have brought this up many times but we don't understand what actually works when something goes right and what failed when something went wrong. Adding to this is that engineers love tools and process so they tend to credit tools and process with success because we like the machine. Giving it credit where credit wasn't due leads to slowly growing more elaborate process and tools over time. This love of tools and process is a fundamental flaw in our culture and it is a big part of why big teams fail and small ones can get things done.
There are two fundamental truths to software, or any real organizational level problem. First, you don't know what the solution is until you have actually built it and are using it and second designing and building something is a non-polynomial growth problem.
The first part of the problem we sort of get, sometimes. The solution is iteration for the same reason it has always been. Assess, step, assess, step isn't just a good way to train a NN, it is also a great way to do pretty much anything where you don't know the optimal solution. Take the gradient of the situation and then take a right sized step in the right direction. Think you can have a perfect design before you start coding? You are basically saying you can take one big step from the start to the end. Either you have a small problem to solve or you are deluding yourself. Successful software is iterative. It always was and always will be. If your retrospective says things like 'if we had just done X from the start' be very careful because you are falling into the hindsight trap. You really couldn't have known X was the right thing. There is a reason you didn't see X. Just accept the iterative nature and own it. Try for appropriate step sizes, do good regular assessments, keep the iterations tight and you will probably be ok.
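The "assess, step" loop above can be sketched literally. This is my own illustration of the analogy (the function, step size, and round count are arbitrary), showing the same shape as training a NN: measure where you are, take a right-sized step, repeat.

```python
# "Assess, step, assess, step" as a minimal 1-D gradient descent:
# each round assesses the gradient at the current position, then
# takes a step whose size is deliberately kept modest.
def descend(grad, x, step=0.1, rounds=100):
    for _ in range(rounds):
        x = x - step * grad(x)  # assess, then step
    return x

# Minimise f(x) = (x - 3)^2, whose gradient is 2(x - 3);
# no single "big step" is taken, yet it converges.
x_final = descend(lambda x: 2 * (x - 3), x=0.0)
print(round(x_final, 3))  # 3.0
```

The point of the analogy: with an unknown optimum, many small assessed steps reliably get you there, where one "perfect design" leap from start to end would require already knowing the answer.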
The second problem, NP growth, is where things really fall off the rails though. People get iterative, they see it work, even if they don't understand what they are really doing, but NP complexity growth is a real killer. The problem is that it actually IS true that if you took more time and put all the pieces together and solved it all as one problem you technically could eventually find the better solution. But more than likely the heat death of the universe will catch you before you do. Oh, yeah, and the total information storage needed to document the combinations tried will likely kill you too. There is only one good solution to NP growth, accept a local minimum and divide and conquer.
NP complexity growth is the foundational problem that needs to be attacked and the why things work or don't. Even more than iterative in many cases. As a problem grows its complexity, the possible number of solutions to check, grows in an NP way. The only solution is to drop the number of options to consider. You have to divide the problem and admit a local optimum is the best practical solution. People -sort of- get this by pretending to break the problem up and give it to different people or teams but then totally blow it. Jira is an example of totally blowing it. So you broke the problem down and you broke the teams into smaller pieces to address those sub problems but then you threw it all in one place again in Jira and you had all the teams in the same standup. You can't do that. That is the point of divide and conquer. You do that and you get lost because the problem just got too big again when you put all the pieces together. Also, communication scales up with people, even without problem size changing. Create too big of a team and the communication eats all the available work. Divide and conquer -requires- not communicating, or at least being exceptionally careful about how you communicate between problems.
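To put rough numbers on "communication scales up with people" (my own back-of-the-envelope, using the classic pairwise-channel count n(n-1)/2, not anything from the comment itself):

```python
# Pairwise communication channels grow quadratically with team size,
# which is one concrete reading of "communication eats all the available work".
def channels(n):
    return n * (n - 1) // 2

# One undivided team of 20 people:
print(channels(20))  # 190 channels

# The same 20 people divided into 4 teams of 5, where only the
# 4 team leads coordinate across team boundaries:
divided = 4 * channels(5) + channels(4)
print(divided)  # 46 channels
```

Dividing the team cuts the channel count roughly fourfold here, which is exactly why re-merging everyone into one Jira board and one standup throws the benefit away.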
The processes and tools we have created and love to use so much are the heart of why things don't work and we need to start admitting that. They give us a false sense that we can make a team bigger or take a bigger problem on. That is a mistake.
If you have done a good job of dividing a problem up, and correctly sized teams, then you have created problems that are clear enough not to need status boards and the like. Sure, go ahead and use them if your small team likes that. Be my guest, but you probably shouldn't. If a team is iterating on their problem and the problem is appropriately scoped then the team knows the state of their entire piece so well that the status boards slow them down. Why put in a jira ticket when you can just deal with it? Why break your internal team communications like that? Team management and project management become easy with small teams since your options are limited and the problem is small so it is all obvious. If you are saying to yourself 'well how will we know the whole thing is on track' well if you divided correctly then every level has a human sized understanding to deal with and is keeping track of their piece. That includes the team that owns teams. They should have designed the teams working for them, and the problems those teams are dealing with, in such a way that the working memory state is enough. They also designed the communication to that team in a way that they stay informed -without- joining that team and in doing so joining all teams. In other words they don't micro manage because that breaks divide and conquer. If any level is lost then the problem may not have been broken down well or has changed. A good iterative team catches this and raises the flag quickly so the divide can happen again if needed. The team leading the team has the job of monitoring to help figure this out, but monitoring in very limited ways so that they don't end up micro managing and collapsing the divisions.
A good military knows this and a bad one has forgotten it. In WWII we had task forces for everything. We could stand up a TF, get it training so that it was a coherent entity, execute the mission needed, and tear it apart. We were amazing at it. When WWII ended we did big things because we carried our understanding of the operational level of war, of how to break apart problems and teams, into industry. We went to the moon. Now, however, we have standing task forces in the US military that are essentially the leftovers from WWII. We create new task forces, badly, that are really just the existing ones renamed, which means they have their old job and new job and nothing has really been broken out and isolated correctly. We suck at war, and a big reason for this is that we have forgotten the operational-level-of-war lessons from WWII.
This is a long rant to get to this final point: the author doesn't get the real reason why '20%' does the work. It is because we hire and create massive teams that can't get anything done because their communication has scaled to 1000% of their capacity. So, naturally, a small core team forms that can effectively communicate and get a job done, by ignoring the other 80%. It isn't the other 80%'s fault; it is the organization's fault for not breaking things up and creating small teams where the size of the problem is understandable and actionable and, most importantly, for then re-merging the problem and the teams with stupid things like Jira boards.
The real solution is the same set of solutions that work time and time again. Create small teams. Give them clear problems to solve and the right tools and authority to solve them. Put bounds on what they should be doing so they, and you, don't get distracted. Understand that a problem is an evolving iterative thing and lean into that. If 80% of your workforce isn't doing things then your organization is broken. Start figuring out how to fix it. Collaboration isn't bullshit. It is fundamental. We just need to actually, intentionally, design that collaboration based on the actual things that shape it. NP growth and iterative understanding.
What everybody keeps forgetting over and over again is that software is super complicated even if it can be changed from a keyboard without the use of physical morphing tools.
People who do not themselves generate software are in the position of telling the people who generate software how to do it and what the constraints should be on the outcomes.
Accept that it is complicated and that you cannot know in advance when it will be done unless it is a super simple request.
It is indeed more like oil field exploration than it is like sweeping the floor.
You cannot really know where the solution to a complicated problem lies in advance and therefore you cannot predict how long it will take you to find it.
People on the finance side just need to face the fact that there is risk that cannot be eliminated in advance or even quantified particularly accurately.
If your investors cannot stomach this, they probably need to invest in something other than software development.
The article is a thought-leadership style piece. By commenting on business you put yourself on a pedestal and can charge for consultancy or even (as in this case) for reading premium content. The thought-leadership style is generally opinionated and matter-of-fact. They are, after all, the expert.
The core issue is that collaboration bullshit is fantastic for mediocre people who cannot produce anything of value and as the team grows, the share of mediocre people will inevitably grow. This is why every single large organization turns into a theatre of processes.
This is a frustrating (bullshit?) blog post because it starts out poorly, gets really good, and ends on a whimper. It has no advice to offer other than "leave me alone, I'm doing things".
High performing collaborative teams and teams-of-teams have the ownership culture that he is describing. But they also have a team level view of progress (kanban is one approach, and not a bad one, story backlogs are another, etc.) because any large initiative requires dependency management, throughput flow visibility, and coordination across teams.
Yes, individual small teams with autonomy perform best and should have minimal hard dependencies on other teams. "One piece continuous flow" is similar to what he's describing as the optimal team flow with minimal waste, which is what Toyota sought to reach in as many teams as they could. Kanban and similar approaches for signaling queues and jobs were a patch for when it wasn't possible to get "one piece continuous flow".
But in complex products and projects, dependencies and uncertainty in requirements, design, knowledge, etc. lead to queues, and thus a need for visibility into the queue. This requires active management of priorities, timelines, risks, and resource allocation, so that it isn't just a blind exercise of "trust me bro".
Progress is also best described as "jobs to be done" (whether user stories, kanban tasks, or whatever) rather than lines of code or other poor metrics. Learning things is a job to be done; iterating on design is a job to be done. If these lead to new tangents, restarts, cancellations, code removal, or the elimination of bad components, sub-projects, or approaches, that is just as valuable in the long run as creating new things. All of that requires "collaboration".
I think the issue isn't "collaboration" it is poor "management" and process to organize humans for results.
Often your salary is not on that budget, so if it takes you twice as long but you don't have to buy/hire/use AWS, winner.
I think the smaller the organization, the more likely that a software projects has real stakeholders. In bigger, more mature organizations, the experienced players have arranged their affairs so that their career progress doesn’t depend on delivery of software: Late, early, or ever. For instance I work on the “hardware” side of technology development, and I tailor my annual performance review goals so that a deliverable is satisfied when I can demo it with code that I’ve written myself.
Hah! You just gave me an idea for a new methodology. Date-bound delivery.
- The business tells you what they want, as they do
- The business tells you when they want it, as they do
- The team does not say how long it will take. Instead, they say what they think they can deliver in the time allotted.
- As the date nears, more edge features get trimmed
- As the date arrives, something is always ready to deliver, no matter how minuscule
Such a methodology would ensure delivery, but not necessarily the contents of that delivery. Post mortems would no longer discuss why something took so long, and instead would focus on why features were cut.
If, as you say, the date is always more important, wouldn't such a methodology be worth trying?
every week, something is delivered, and is demoable, with approved tests from the business. That thing represents the most important thing to the business relative to the risk prioritization from engineering & usability prioritization from design.
every week, priorities can adjust, etc. and the cycle continues. hitting the actual 'release date' becomes much more knowable when you see the tangible date-driven progress on a regular cadence.
The business does not care about week long deadlines. They need something on May 23 so they can achieve _______.
My understanding of Scrum (not representative of all agile, I know) is that the velocity is supposed to be tracked and used for better predictions. In my experience this takes a very dedicated core of people who are intent on making it happen. In other words, usually it doesn't happen.
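A minimal sketch of what that tracking looks like mechanically (the function name and numbers are illustrative, not part of Scrum itself):

```python
import math

def forecast_sprints(completed_per_sprint, backlog_points, window=3):
    """Estimate remaining sprints from a rolling average of recent sprints."""
    recent = completed_per_sprint[-window:]
    velocity = sum(recent) / len(recent)   # average points per sprint
    # Round up: a partial sprint still costs a whole sprint.
    return math.ceil(backlog_points / velocity)

# Last three sprints delivered 18, 22, and 20 points; 100 points remain.
# Velocity is 20, so the forecast is 5 more sprints.
print(forecast_sprints([18, 22, 20], 100))
```

The hard part, as noted, isn't the arithmetic; it's getting a team to record completions consistently enough that the rolling average means anything.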
But date-bound delivery is already our default mode of operation. We just don't like to admit it. We are going to deliver something on this date; we just don't know what, yet.
However the point of the weekly cadence is that the business does care about adjusting scope and priority towards hitting that deadline on May 23, so that they know what they're going to get on May 23 and have the power to adjust it.
Especially if the goal of what is delivered on that date is not clearly defined. It almost never is.
Most projects can be summed up as "give me $X, I'll come back in 6 months, and ask for more time and money", or "here you go"... "that's not what I wanted".
It's a key risk mitigation toward a hard date to know every week if you're still getting what you wanted.
Velocity is overblown as a metric. It's one metric among many that can signal a few things (e.g. quality problems because bug fixes are overtaking features) but isn't as much of a lever as some say.
I have a mixed relationship to it, but the scope cutting part of it works extremely well.
The emphasis it puts on the problem being solved, rather than on the concrete solution, is also healthy, I feel.
That meant, that as the inevitable schedule crunch arrived, the things that were tossed in the skip were not important.
I call it "Front of the Box/Back of the Box." I basically got the idea from The Simplicity Shift[0].
[0] https://jenson.org/The-Simplicity-Shift.pdf
Then they proceed to implement the solution in 30”.
In short, he proved that even AI agents exhibit all the dysfunctions one would normally attribute to human shortcomings / politics / laziness etc.
Either way, I think the point is strong: if the organization is bad, you end up doing mostly work about work which is exhausting.
Small, effective teams with super high accountability are more fun, but don't look "reproducible" or "repeatable".
Shameless plug on my take: https://www.menge.io/blog/where-to-cut/
IMO, "ish". You can reliably and repeatedly produce good teams _if_ you reliably and repeatedly invest in your people.
IMO, what's really happening is that small, effective teams aren't _fungible_ - you can't just swap people around without breaking the magic in a team, and you can't just move a team around an organization without similarly breaking the magic (although the latter _is_ way more possible).
IMO, it's sort of an organizational version of "context switching". It takes time for a team to gel and get up to speed. If you're swapping out team members, you break that cohesion. If you move teams around, you (somewhat) reset that ramping-up process.
I wonder if that made it into the training set intentionally, or just as an unexpected side effect of stealing every character of text available on the internet with absolutely no curation?
Teams are how you do big stuff. I'm really good at what I do, but I've been forced to reduce my scope, working alone. I do much smaller projects than our team used to do.
But the killer in teams is communication overhead, and much of that is imposed by management trying to get visibility. If the team is good, they often communicate fine internally.
Most of the examples he gave are tools of management seeking visibility.
But it’s also vital for management to have visibility. A team can’t just be a “black box,” but a really good team can have a lot of autonomy and agency.
You need good teams, and good managers. If you don’t have both, it’s likely to be less-than-optimal.
They could review PRs and commits and specs to get visibility and reduce comms overhead, if they had the skills and time.
The non-technical manager also finds it very convenient to make technical people spend their time translating things. But no one ever asks the manager to learn new skills as much as they make developers do it.
Otherwise yeah there’s really no point.
literally everything else is work off the kanban board or backlog.
in my teams everyone was told to decline all meetings unless it explicitly led to the completion of a weekly planned story/task. this way all meetings for the team have a clear agenda and end in mind.
for mandatory external meetings & running interference with external parties, there are ways to insulate the majority of the team from that.
If the Scrum Master or whatever their title schedules any other repeating process meetings, fire them.
On my second team, the visibility theater took over, upper management set and reset and reset and reset our direction, and nobody was happy. In retrospect, I should have said no immediately. Trusting and empowering your people is hard to beat.
That's why the most effective teams are wolf packs - roughly 6-10 highly performant members where communication overhead is still low enough that it barely matters, but have enough people to be way more productive than an individual.
Obviously there's a minimum level of competence you need to have for this to work. The smaller the team the less freeloaders are tolerated.
You want to break a team of 10 in half if you can. Not always easy. But if you can manage it, do it.
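The arithmetic behind that advice is the classic pairwise-channels count (after Brooks; the function name is just illustrative):

```python
def channels(n: int) -> int:
    """Pairwise communication channels in a team of n people: n choose 2."""
    return n * (n - 1) // 2

# One team of 10 has 45 potential channels; two teams of 5 have
# 2 * 10 = 20 internal channels plus a few cross-team links, so the
# split roughly halves the coordination surface.
print(channels(10), channels(5))
```

Quadratic growth is why overhead "barely matters" at 6 people but dominates at 20.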
I think the author has identified that most organizations both fail at effective collaboration, and also use collaboration to paper over their failures.
I think the author maybe over-corrects by leaning on the idea that "only small teams actually get stuff done", and honestly I don't think anyone should be using SLA Marshall/Men Against Fire as an analogy for like... office work (if nothing else, even if you take his words at face value, then the percentage of US infantry who fired their rifles went up from 15-25% in WW2 to ~50% in Korea due to training improvements), but I can get behind the idea that a lot of organizations are set up to diffuse responsibility.
I also do think it's interesting to think about building the Pyramids. For the vast majority of people involved... I don't think modern audiences would call their work relationship or style "collaborative". Usually we use "collaborative" in opposition (at different times) to "working alone", "working with strict boundaries", and "being highly directed in what to do". Being on a work gang, or even being a team foreman is very much "no working alone", but those were also likely highly directed jobs (you must bring this specific stone to this specific location by this time) with strict boundaries.
The author says, "The collaboration industry has spent a fortune obscuring a dirty truth: most complex, high-quality work is done by individuals or very small groups operating with clear authority and sharp accountability" which means collaboration can work... in the right environment and with the right people. I work in R&D and I could not imagine not working in a collaborative environment. It's not reasonable to have expertise at everything and it's understood that things have to get done no matter whose name is on the ticket/story.
I also agree with your calling out the Men Against Fire example. That's not a collaboration issue; that's a training issue (among other things). And that problem went away, as you said.
> By 1946, the US Army had accepted Marshall’s conclusions, and the Human Resources Research Office of the US Army subsequently pioneered a revolution in combat training which eventually replaced firing at ‘bulls eye’ targets with deeply ingrained ‘conditioning’ using realistic, man-shaped ‘pop-up’ targets that fall when hit. Psychologists know that this kind of powerful ‘operant conditioning’ is the only technique which will reliably influence the primitive, mid-brain processing of a frightened human being. Fire drills condition terrified school children to respond properly during a fire. Conditioning in flight simulators enables frightened pilots to respond reflexively to emergency situations. And similar application and perfection of basic conditioning techniques increased the rate of fire to approximately 55 percent in Korea and around 95 percent in Vietnam.
This piece is more of a whine about a certain kind of office culture, which the author - unreasonably - generalises to collaboration as a whole.
There's likely a lot of money to be made by identifying and defining good vs bad collaborative cultures.
Both are real. But a lot of "good" practices are more cargo culty than genuinely productive, and the managers who really do make it work seem to get there more by talent and innate skill than learned effort.
It's not a linear scale. A lone wolf can't produce the latest Assassin's Creed game. A committee can't produce Stardew Valley or Balatro. They're different capabilities, not a simple matter of more/less.
What organization, skills, leadership is required to explore a jungle for gold is very different from what organization, skills and leadership is required to run a gold mine.
So we get explore-exploit tradeoffs, satisficing vs optimizing choices etc.
Being laid off from a startup and moving to a corporation did give me perspective. The first year working with the team went really well; we got a lot of stuff done and the business was very happy.
And then came the Agile Coach telling us to "Collaborate", disguising his own agenda as a need (he's also a PO for another squad). So: workshops on Collaboration, explicit expectations that the PM has all authority and controls the PO, for 8 freaking months, just to get a competent team to work with a junior team that had no agency, nor even willingness to be mentored or do anything. So this article incidentally aligns perfectly with my experience.
Corporations always manage to hire incompetent people, never fire them, and let others over-compensate for their failures. So yeah, it's not really obvious, but it's there.
I believe good collaboration can happen, but only when people actually let go of their ego and start listening and actually doing the work.
The answers to basic questions like that already start to shape behavior. If you pay zero attention to how people behave, and only look at the impact of what was delivered, you may promote people who optimize for their own work but make others miserable. If you don't properly weight quality, especially now with AI code gen, you'll promote people who move fast and break more things than is reasonable.
We can easily find examples of suboptimal behavior that arises out of poorly shaped rewards incentives at companies. Empire building is one behavior that is the result of managers getting promoted based on headcount. Stack ranking can and has led to people limiting collaboration with peers because someone has to fail in order for someone else to get a favorable rating. Or people avoid riskier work because failure can put you on the hot seat.
By default in the dominant culture, most systems come down to individual incentives for individual drive and shame dynamics for collective drive, and that covers a decent chunk of how people are motivated, but leaves out people who are motivated differently and actively harms people for whom these are paralyzing.
others need to fill a gap -- the "insecure overachiever" demographic.
https://www.bbc.com/worklife/article/20180924-are-you-an-ins...
how do you align ruthless sociopaths, gropy / rapey executives, angry mother hens, and phone-it-in interns?
The problem with this is conflating *output* with *impact*. A team of lone wolves writing 1k LOC by the hour is good output but not necessarily good impact.
A team with higher coordination overhead and "structural support" will probably have lower output, but if it focuses on significantly higher leverage activities might just have a better impact. The key question is whether that impact is visible and understood (often not) and lots of businesses are bad at understanding leverage.
I see this lone wolf BS from lots of founder types who mistake their own grind for real performance, often missing their own blind spots. I don't disagree that the 80-20% rule comes up and that some people have an outsized contribution overall compared to others, but to say that collaboration is dead as a result is throwing the baby out with the bath water.
> The collaboration industry has spent a fortune obscuring a dirty truth: most complex, high-quality work is done by individuals or very small groups operating with clear authority and sharp accountability, then rationalized into the language of teamwork afterward. Dostoevsky wrote _The Brothers Karamazov_ alone. The Apollo Guidance Computer came from a team at MIT small enough to have real ownership, hierarchical enough that Margaret Hamilton's name could go on the error-detection routines she personally designed.
Contrast this with the claims of “democratizing knowledge” and the image of a utopia where everyone contributes original work into a black box and expects no credit and no compensation in return (in fact, happily paying for the privilege of using it).
We, humans, like to have created something worthy of kudos. We pull the rope less hard when it’s a collective effort than when the rope is just yours alone.
Collaboration between us is the default (no one exists in isolation), but forcing a particular sense of collaboration onto people is a different thing.
So while the article you linked isn't confused on the subject, and I doubt Marshall was mixing support personnel in with front-line soldiers in his numbers, I do wonder whether there are people who confuse those two numbers: the number of soldiers, sailors, coasties, airmen, or marines who would never be in combat even during times of war, vs. the number who would actually be in combat and not fire.
(The article did address "what if the battle never came near where those particular soldiers were standing?", which was the other question I wondered about).
Not to mention the fact that this was a time of much more serious discipline issues. People were executed for desertion, and despite that many people did. There was also much malingering, up to and including literally shooting oneself in the foot. Is it so hard to believe that some people just hid when battles came?
I'd be very surprised to hear from the other person that by Vietnam they had gotten it up to 95%, though. My impression was that the most effective move away from this sort of thing was the move to a professional volunteer army, with no conscription.
0 - https://en.wikipedia.org/wiki/On_Killing
it relies on SLA Marshall's dubious work, and several other examples it uses are difficult to take seriously.
it's similar to Freud, where there are shreds of truth but not really universally true or applicable.
Yes.
> deadly to an organization is the collective impulse to avoid giving those people credit when it's due.
No, in fact most office jobs operate this way in the world.
> Dostoevsky wrote _The Brothers Karamazov_ alone. The Apollo Guidance Computer came from a team at MIT small enough to have real ownership, hierarchical enough that Margaret Hamilton's name could go on the error-detection routines she personally designed
I have good news for you, my jaded friend! What is similar between those people and you? You're an individual! Therefore you could write another masterpiece yourself; you can be the next Notch, the next copyparty guy, the next Stardew Valley guy, joining the long list of creations by actually high-performing individuals, not some complainer who is oh so encumbered by stupid social dancing.
Yeah but you'd think not dying involves killing those who want to kill you, or at least shooting at them! Isn't it super interesting to learn that 80% of riflemen don't ever shoot?
b) even if it was remotely true, context matters. Refusing to shoot someone point blank because of reasons is one thing, refusing to go against Tiger 2 is another.
Or it gets stuck in code review cause one colleague likes nitpicking everything endlessly, so you’re stuck changing working code for multiple days.
Or they have questions and want to spend 2-4 hours in a meeting about design and how to do development “better”, bonus points for not writing anything down for future reference, them expecting you’ll keep a bunch of rules in mind. No ADRs, no getting started guides, no docs about how to do deliveries, probably not even a proper README.md, or versioned run profiles or basic instructions on how to get a local DB working (worst case, everyone uses the same shared instance).
Even more points for not even having retrospectives and never looking at stuff critically - why people keep creating layers upon layers of abstractions and don’t care about the ideas behind YAGNI/KISS. More so, no actual tooling to check things (e.g. code style, but also tools to check architectural stuff, and also obviously no codegen to deal with the overly abstracted bs).
It all depends on the project and team a lot. Some people have only had the fortune to work in locales and environments where stuff like that isn’t commonplace but rest assured, it can get BAD out there.
Working in a good team can be better than working alone, sure!
But working in a bad team is certainly worse than working alone.
Especially so when seniority is measured in years or nepotism and you’re told to not rock the boat and shut up cause “we’ve always done things this way”. I'm exaggerating a bit here, but I’m certain that plenty of people work in conditions not far removed from that.
Collaboration isn't a process or a management technique -- it is a communication style. If you want collaboration, you can't take random people and use process to "make them collaborate" -- you need to build your team out of people who are collaborators.
Furthermore, collaboration is not at odds with accountability. Most of the highest performing collaborative teams I've ever worked on have people who are each individually highly accountable for their own contributions, and that's a critical part of what makes them a valuable collaborator.
Yes! I would add that IMO the communication style can be learned and there are great rewards for doing so.
I believe the rough statistic that 20% of people on a typical project are contributors. I don’t believe that it’s because the other 80% are losers. IME it’s because no serious effort has been made to include them, make sure they understand wtf is going on around them, and help them solve whatever is holding them back.
If you do this, a) it does work, and b) the need for small teams becomes apparent because the now-onboarded person can’t find anything that isn’t already being worked on, so they (with encouragement) start a new thing. And there are limits to people’s ability to understand what’s happening, especially if they’re inexperienced, and some people really don’t have the skills to contribute, but by and large, building bridges for people is still highly worth doing.
Something like UBI with extra steps.
And perhaps the bigger issue to get over, there perhaps ought not be a moral component to this, in a world where technology + a small number of people can easily take care of ALL actual needs.
My current employment is the first time I haven't seen this, the Pareto Principle.
The reason for this in biology is twofold:
1. Growth is distributed evenly as necessary to fill the containing system whether it is employment numbers or peas in a pod.
2. Resources are distributed to where they are demanded. Higher-productivity individuals will consume a disproportionate number of tasks to complete, as well as available resources. This difference is often statistically insignificant as a base difference, but after compounding, 20% of individuals account for 80% of outputs and inputs.
Human behavior shows the same compounding because the numerical growth dynamics are similar enough.
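That compounding claim is easy to sanity-check with a toy model (illustrative assumptions only, not a biological model): give 100 workers nearly identical per-round productivity and let the tiny differences compound:

```python
# Per-round productivity rates spread over a narrow band: 0.98 .. 1.10.
rates = [0.98 + 0.12 * i / 99 for i in range(100)]

# After 50 rounds of multiplicative compounding, output is rate ** 50.
output = sorted((r ** 50 for r in rates), reverse=True)

top20_share = sum(output[:20]) / sum(output)
print(f"top 20% of workers produce {top20_share:.0%} of total output")
```

A ~12% base difference ends up concentrating well over half of total output in the top fifth; longer horizons push it further toward 80/20.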
What's depressing is that it's like Fred Brooks' book never happened: most managers think the way to solve IT problems is just to throw more people / more money at them until they get solved; and they're all surprised when it doesn't work, but try again the next time anyhow.
If you get a bunch of people in a room and ask them for a design, one person is going to write the design while everyone else gets in the way. That's simply the nature of groups. The one person who writes it isn't even necessarily the best designer—they're just the one most willing to grab the whiteboard marker.
Conversely, if you ask one person to produce a preliminary design, they can leave, gather requirements, do research, produce a plan, and then convene everyone in a room to review it. Now all the abstract hypotheticals have been put to bed, the nebulous directionlessness has been replaced with a proposal, and the group can actually provide useful feedback and have a discussion that will inform the next draft of the design. And once the design is finished, everyone can easily work together to implement it as written. Collaboration is great, after someone has proposed a design.
That's part of what I like about the idea of Amazon's "culture of writing," though I've never worked in an environment like that in practice. Every idea needs to be preprocessed into an actionable memo before anyone tries to have a meeting about it.
That's before we even think about all the consultants and similar roles where busywork really is work. Then all the organizational or agile roles.
The fact that some product gets shipped and we still have customers is good, because that's what pays for it all, but that is just the foundation we all rest on. Almost like background noise.
Linux and Wayland.
Both are collaborative efforts; one has fairly effective and tyrannical leadership with the best interests of the community in mind. The other is led by a committee with competing interests and goals, where they all have veto power.
Those same collaborators are reflected in the distro situation... Here is a group that also has some rather tyrannical leadership, but they have dependencies (see the software they run), and some of those folks are sick of the distro maintainers' nonsense and went to things like Flatpaks (see Bottles for an example).
> most managers think the way to solve IT problems is just to throw more people / more money
Leadership vs Management, a tale as old as time.
Collaboration sucks - https://news.ycombinator.com/item?id=45892394 - Nov 2025 (248 comments)
In such scenarios nobody wants to stick their neck out at all, everyone hates everyone else.
At a higher level the usual problem is with incentives being different from one team to another. If you want something done you have to start with the incentives rather than expect people to work against them and there does have to be leadership to break deadlocks.
Another example: bugs that are not found by testers - whose fault is that - development or test?
Clarity is just another way in which one person or group tries to lay blame.
that is a meaningless buzzword salad masquerading as a deep insight
I'm not saying teamwork is not needed, but it for sure should not be the #1 indicator for teams.
Vetting for character to skip dycks is still fine in my eyes.
Apparently the Taliban find office work far worse than being fighters, so you never know!
Now there's pretense of doing work. If you're smoking, you tend to shut up and listen to others.
Email, though good in terms of faster communication, enabled the whole email-chain thing of faking as if one is doing work.
If folks don't clearly define the finish line, then a project will run out of budget eventually as scope creep turns it back into the same useless mess.
Proper restructuring usually means firing people that do not recognize they are in a business setting. The Big Brain fallacy is a common bias with people too blind to recognize what other professions bring to the table, and ultimately are unproductive in a large project.
Every manager should read Moneyball before building operational teams with HR, and avoid the trap of the costly Big Brain =3
"Moneyball: The Art of Winning an Unfair Game" (2003, Michael Lewis)
https://www.amazon.com/Moneyball-Art-Winning-Unfair-Game/dp/...
https://en.wikipedia.org/wiki/The_Tyranny_of_Structurelessne...
This is definitely how it can feel sometimes, but it's simply not true. The problem really is just poor communication and big knowledge gaps.
I do understand that not all workplaces allow for enough discussion. Arrogant behavior from leadership is just them cracking under the pressure. You should know that you're never the only one noticing it.
Don't get dragged down with them. You can voice your concerns candidly with the right people who care the most about the outcomes and let the chips fall where they may, or you can suffer in silence like they expect you to. It's sad that this is the expectation because these conversations are just as much a part of "collaboration" as anything else. I expect there may be some defensive replies from certain types of people who feel threatened by this idea.
What you should never do is let this stuff get under your skin or take it personally. The fact of the matter is, teamwork is the nature of any business. When people go rogue, it only makes the problems worse. Everyone misses out on opportunities to grow and the project suffers from the lack of coordination to continue long term.
The only reason weak management is allowed to exist is because people never speak up. If you're the kind of person confident enough to speak up, you might be let go but will almost certainly find something better too.
If you're not that kind of person... well then fine go ahead and suffer. Is what it is.
At the beginning of the Battle the weather was terrible, stopping the normal collaboration with the air force. When the weather cleared, collaboration restarted, and both arms could work together much more effectively than the army alone.
and I feel that it's a much better way to work.
whenever I need real user feedback, I just ask my wife next to me to test out the features and give end-user feedback
It's well written and brought to light a very interesting subject, e.g. "Marshall’s research showed that just 15-20% of riflemen in active combat positions ever fired their weapons".
And FWIW I don’t think you can solve this by always hiring the “best” either, at least not beyond a certain team size.
> Collaborating means the failure belongs to the process.
This is the way that hierarchy fails to scale. The larger a hierarchy, the more "process" must exist to keep it together. Process in a hierarchy must be defined by superiors, and implemented by inferiors, so it is superiors who must own the failure of process, and inferiors who are blamed for it.
> The average knowledge worker maintains accounts across system after system, switching between applications hundreds of times per day.
This is the more serious problem that comes from hierarchy. Work done by a team must become its own isolated context: the project. Anything anyone hopes to be able to do with a computer must be monopolized somehow by an "application". Why? An application is the only practical goal that a project can have. Anything more would have too broad a scope to be manageable.
---
We don't need hierarchy. There is another way to do work, and despite having made incredible tools for it, we have barely scratched the surface.
We have a decentralized internet, email, and git, so why do we keep making applications? An application is not only a reflection of the hierarchy that makes it; it's also a reflection of the environment. No matter how you contribute to the puzzle, your contribution must be a piece. How else could it fit with the rest?
Free software has been struggling with this dichotomy from the beginning, but it's only getting worse. Most of the systems we use are becoming ever more consolidated and impositional. Most people don't configure their system by editing each of the unique config files: they open the GNOME/XFCE/KDE "system settings", and expect it all to stay consistent. Most of what we actually do with computers is facilitated by one of two major web browser engine implementations. Want to make a new window manager? Get ready to build a feature-complete Wayland compositor (probably leveraging the bulk of another compositor's code). Sure, you can use shell utilities, but it's not like we are doing anything new or interesting with them: just transforming text with a careful emulation of a 50 year old environment.
---
There's no clear way to resolve this situation. Software is useless until it can fit as a piece in the puzzle. If we want a system that is not a puzzle, then wouldn't we have to throw away all the precious pieces?
Even so, I think we have really lost touch with the original magic of computing. All these isolated contexts are walls that stop us in our tracks. Formats, accounts, applications, frameworks, platforms... all dead ends. Maybe it's time we make a path that doesn't end?
50% of the work is done by the square root of the total number of people who participate in the work.
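That's Price's law. One way to see it: if contribution falls off like 1/rank (a Zipf-style assumption, not an established fact about workplaces), the top √N contributors account for roughly half the total. A quick sketch:

```python
import math

def top_sqrt_share(n: int) -> float:
    """Share of total output from the top sqrt(n) contributors,
    assuming the contribution of rank k is proportional to 1/k."""
    k = math.isqrt(n)
    top = sum(1 / r for r in range(1, k + 1))       # top sqrt(n) people
    total = sum(1 / r for r in range(1, n + 1))     # everyone
    return top / total

print(round(top_sqrt_share(10_000), 2))  # ~0.53: 100 of 10,000 do half
```

The harmonic sums make the "half" come out naturally: the top √N slice contributes ln(√N) ≈ ½ ln N out of a total of roughly ln N.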
I don't even think in this age it's a "collaboration" issue. We've seen years of mass layoffs at this point and there's little rhyme nor reason. Sometimes being the "producer" saves you, but not always. Hard work isn't rewarded and leverage isn't necessarily respected anymore.
Your best bet these days is "collaborating" with someone high up who can shield you. Not because you're a producer, but because they like you. The illusion of meritocracy has completely collapsed (at least in large companies).
There are two fundamental truths about software, or any real organizational-level problem. First, you don't know what the solution is until you have actually built it and are using it; and second, designing and building something is a non-polynomial growth problem.
The first part of the problem we sort of get, sometimes. The solution is iteration, for the same reason it has always been. Assess, step, assess, step isn't just a good way to train a neural network; it is also a great way to do pretty much anything where you don't know the optimal solution. Take the gradient of the situation and then take a right-sized step in the right direction. Think you can have a perfect design before you start coding? You are basically saying you can take one big step from the start to the end. Either you have a small problem to solve or you are deluding yourself. Successful software is iterative. It always was and always will be. If your retrospective says things like 'if we had just done X from the start', be very careful, because you are falling into the hindsight trap. You really couldn't have known X was the right thing. There is a reason you didn't see X. Just accept the iterative nature and own it. Try for appropriate step sizes, do good regular assessments, keep the iterations tight, and you will probably be OK.
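The "assess, step" loop above is literally how gradient descent works. A toy sketch (the function and step size are made up for illustration):

```python
# Minimize f(x) = (x - 3)^2 by repeatedly assessing the slope and
# taking a right-sized step. No single big jump gets you to the answer;
# many small, corrected steps do.
def f_prime(x):
    return 2 * (x - 3)  # gradient of (x - 3)^2

x, step_size = 0.0, 0.1
for _ in range(100):
    x -= step_size * f_prime(x)  # assess, then step

print(round(x, 4))  # converges toward the minimum at x = 3
```

Too large a step size and the loop diverges; too small and you crawl. Projects have the same failure modes: giant big-bang releases, or iterations so timid nothing ships.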
The second problem, NP growth, is where things really fall off the rails though. People get iterative, they see it work, even if they don't understand what they are really doing, but NP complexity growth is a real killer. The problem is that it actually IS true that if you took more time and put all the pieces together and solved it all as one problem you technically could eventually find the better solution. But more than likely the heat death of the universe will catch you before you do. Oh, yeah, and the total information storage needed to document the combinations tried will likely kill you too. There is only one good solution to NP growth, accept a local minimum and divide and conquer.
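Rough arithmetic shows why dividing wins, even at the cost of a merely local optimum. Assuming (simplistically) a design with n independent binary decisions:

```python
# With n binary design decisions, the joint search space is 2**n.
# Split into k isolated sub-problems of n/k decisions each, and the
# total work drops to roughly k * 2**(n/k). (Numbers are illustrative.)
n, k = 60, 6
whole = 2 ** n                    # one monolithic problem
divided = k * 2 ** (n // k)       # six isolated sub-problems
print(f"monolithic: {whole:.2e} combinations")
print(f"divided:    {divided} combinations")
```

The monolithic search is on the order of 10^18 combinations; the divided one is a few thousand. The divided answer may miss the global optimum, but the monolithic search never finishes at all.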
NP complexity growth is the foundational problem that needs to be attacked, and the reason why things work or don't. Even more than iteration, in many cases. As a problem grows, its complexity, the number of possible solutions to check, grows in an NP way. The only solution is to drop the number of options to consider. You have to divide the problem and admit a local optimum is the best practical solution. People -sort of- get this by pretending to break the problem up and give it to different people or teams, but then totally blow it. Jira is an example of totally blowing it. So you broke the problem down and you broke the teams into smaller pieces to address those sub-problems, but then you threw it all in one place again in Jira and you had all the teams in the same standup. You can't do that. That defeats the point of divide and conquer. You do that and you get lost, because the problem just got too big again when you put all the pieces together. Also, communication scales up with people, even without the problem size changing. Create too big a team and the communication eats all the available work. Divide and conquer -requires- not communicating, or at least being exceptionally careful about how you communicate between problems.
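The claim that communication scales up with people, independent of problem size, is just pair counting (the same observation Brooks made in The Mythical Man-Month): n people have n*(n-1)/2 possible channels.

```python
# Communication channels grow quadratically with team size,
# even if the problem being solved stays exactly the same.
def channels(n):
    return n * (n - 1) // 2  # number of distinct pairs among n people

for n in [5, 10, 50, 100]:
    print(f"{n:>3} people -> {channels(n):>5} channels")
```

A 5-person team has 10 channels; merge twenty such teams into one 100-person standup and you have 4,950. That is the re-merging failure described above, in one number.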
The processes and tools we have created and love to use so much are the heart of why things don't work and we need to start admitting that. They give us a false sense that we can make a team bigger or take a bigger problem on. That is a mistake.
If you have done a good job of dividing a problem up, and correctly sized the teams, then you have created problems that are clear enough not to need status boards and the like. Sure, go ahead and use them if your small team likes that. Be my guest, but you probably shouldn't. If a team is iterating on their problem and the problem is appropriately scoped, then the team knows the state of their entire piece so well that the status boards slow them down. Why put in a Jira ticket when you can just deal with it? Why break your internal team communications like that? Team management and project management become easy with small teams, since your options are limited and the problem is small, so it is all obvious. If you are saying to yourself 'well, how will we know the whole thing is on track?', well, if you divided correctly then every level has a human-sized understanding to deal with and is keeping track of its piece. That includes the team that owns teams. They should have designed the teams working for them, and the problems those teams are dealing with, in such a way that working-memory state is enough. They also designed the communication to that team in a way that keeps them informed -without- joining that team, and in doing so joining all teams. In other words, they don't micromanage, because that breaks divide and conquer. If any level is lost, then the problem may not have been broken down well, or it has changed. A good iterative team catches this and raises the flag quickly so the divide can happen again if needed. The team leading the teams has the job of monitoring to help figure this out, but monitoring in very limited ways so that they don't end up micromanaging and collapsing the divisions.
A good military knows this and a bad one has forgotten it. In WWII we had task forces for everything. We could stand up a TF, get it training so that it was a coherent entity, execute the mission needed, and tear it apart. We were amazing at it. When WWII ended we did big things, because we carried our understanding of the operational level of war, of how to break apart problems and teams, into industry. We went to the moon. Now, however, we have standing task forces in the US military that are essentially the leftovers from WWII. We create new task forces, badly, that are really just the existing ones renamed, which means they have their old job and their new job and nothing has really been broken out and isolated correctly. We suck at war, and a big reason for this is that we have forgotten the operational-level-of-war lessons from WWII.
This is a long rant to get to this final point. The author doesn't get the real reason why '20%' does the work. It is because we hire and create massive teams that can't get anything done, because their communication has scaled to 1000% of their capacity. So, naturally, a small core team forms that can effectively communicate and get the job done, by ignoring the other 80%. It isn't the other 80%'s fault; it is the organization's fault for not breaking things up into small teams where the size of the problem is understandable and actionable, and, most importantly, for re-merging the problem and the teams with stupid things like Jira boards.
The real solution is the same set of solutions that work time and time again. Create small teams. Give them clear problems to solve and the right tools and authority to solve them. Put bounds on what they should be doing so they, and you, don't get distracted. Understand that a problem is an evolving, iterative thing and lean into that. If 80% of your workforce isn't doing things, then your organization is broken. Start figuring out how to fix it. Collaboration isn't bullshit. It is fundamental. We just need to actually, intentionally, design that collaboration based on the actual things that shape it: NP growth and iterative understanding.
What everybody keeps forgetting over and over again is that software is super complicated even if it can be changed from a keyboard without the use of physical morphing tools.
People who do not themselves generate software are in the position of telling the people who generate software how to do it and what the constraints should be on the outcomes.
Accept that it is complicated and that you cannot know in advance when it will be done unless it is a super simple request.
It is indeed more like oil field exploration than it is like sweeping the floor.
You cannot really know where the solution to a complicated problem lies in advance and therefore you cannot predict how long it will take you to find it.
People on the finance side just need to face the fact that there is risk that cannot be eliminated in advance or even quantified particularly accurately.
If your investors cannot stomach this, they probably need to invest in something other than software development.
Good luck with finding that in 2026.
And if the job is ass like that just get a new job.
Or start your own company.
This article is just complaint slop, and complainers are just as bad if not worse. Do something.
High performing collaborative teams and teams-of-teams have the ownership culture that he is describing. But they also have a team level view of progress (kanban is one approach, and not a bad one, story backlogs are another, etc.) because any large initiative requires dependency management, throughput flow visibility, and coordination across teams.
Yes, individual small teams with autonomy perform best and should have minimal hard dependencies on other teams. "One piece continuous flow" is similar to what he's describing as the optimal team flow with minimal waste, which is what Toyota sought to reach in as many teams as they could. Kanban and similar approaches for signaling queues and jobs were a patch for when it wasn't possible to get "one piece continuous flow".
But in complex products and projects, dependencies and uncertainty in requirements, design, knowledge, etc. lead to queues, and thus a need for visibility of the queue. This requires active management of priorities, timelines, risks, and resource allocation, so that it isn't just a blind exercise of "trust me bro".
Progress is also best described as "jobs to be done" (whether user stories, kanban tasks, or whatever) rather than lines of code or other poor metrics. Learning things is a job to be done. Iterating on design is a job to be done. If these lead to new tangents, restarts, cancellations, code removal, or the elimination of bad components, sub-projects, or approaches, that is just as valuable in the long run as creating new things. All of it requires "collaboration".
I think the issue isn't "collaboration"; it is poor "management" and process for organizing humans for results.