https://www.youtube.com/watch?v=rQKis2Cfpeo
"The greatest people are self managing, they don't need to be managed. Once they know what to do they will figure out how to do it. What they need is a common vision, and that's what leadership is. Leadership is having a vision, being able to articulate that, so the people around you can understand it, getting a consensus on a common vision."
"We were in stage where we went out and thought, Oh! We are going to be a big company, so let's hire professional management. We went and hired a bunch of professional management but it didn't work out all well. Most of them were bozos, they knew how to manage but they didn't how to do anything!"
"If you are a great person, why would you want to work with someone you can't learn anything from? You know what's interesting? You know who the best managers are? They are the great individual contributors who never ever want to be a manager but decide they have to be a manager because no one else is gonna do a job as good as them"
Programmers/developers are only effective if either they themselves or enough people on the team have sufficiently deep domain knowledge.
That means you can only write accounting software if you are an accountant in addition to being a developer. You can replace "accounting" with anything else you want.
Software developers don't want to hear this because it means that being a developer is near useless: it allows them to express themselves in code but ... they have nothing to express.
Accountants don't want to hear this because it means no generic software developer (or firm) can deliver on the software they want.
The real bad news for software devs is this: you'll do a lot better as a bad developer with expert domain knowledge than vice versa. This is why Excel sheets and VBA macros can run for decades when great, easily maintained software cannot: the knowledge they were written with is what makes the difference.
Of course, you see both situations constantly in the real world: software developers making software that doesn't support the function it was written for, and really, really badly written pieces of crap software that work amazingly well.
If you focus only on deliverables and deadlines, you'll end up with developers using a mix of different approaches, libraries and even languages. It hurts team cohesion and makes the logistics of project management much harder since Joe can't take over Tom's code now that Tom has the flu.
As I see it, one of the main tasks of a technical manager is to set conventions so that everyone feels at least semi-comfortable with everyone else's code. That's not micro-management, that's management.
This is actually an extremely inefficient and demoralizing environment for those involved. Yeah, it's easy to take over when someone leaves, because they were so hamstrung by the environment that they never built anything interesting. And you're going to be doing a lot of this taking over, because people are always leaving, because they were hamstrung. So this idea that we can't trust individuals to stick around and do a good job, and we have to make sure they never have enough power to do damage when they make a mistake, is a self-fulfilling prophecy. It makes them untrustworthy and drives them away.
It really is about ownership. Programmers who are achieving at a high level move much faster and do much greater things. It's worth letting them make mistakes to retain the best people and get their best work. The only catch is figuring out how to keep them accountable for their decisions. OK, you want to use this new tech or try this new architecture. How do we tie your compensation and career progress to the success of those decisions?
I'd be more inclined to say one of the main roles of any manager is facilitation, more than anything else. Any manager who "sets conventions" is somebody who wants the easy part of technology without the corresponding hard part.
Talking about technical solutions is easy. Implementing them is much harder.
I've come across multiple "technical managers" who were last hands-on with code more than x years ago, and they always end up talking out of their arse. And that's not to mention the numerous wrong choices they've committed to or spoken about in high-level meetings with zero real notion.
Common ownership does not end up with a random patchwork of technologies. Teams make collective decisions on them.
I might like to write CGI programs in Prolog with a MarkLogic db, I'm not going to unilaterally decide to write my bit of a team application like that when everyone else writes WSGI Python applications with Postgres.
They don't exist to make good people more productive but to make mediocre hires marginally productive.
To be honest, my own senses were numbed by being a yes-man at a corporate job. My job title made it sound like I ran the world, but much of the work I was made to do was simply idiotic. And I was very happy to toe the line because it was easy, you could always spread blame and I didn't have to think much. Then I realised I have a limited amount of time on this planet and decided to do something more useful with my time...
Another example I've discussed a lot with my colleagues is company-wide coding style definitions. They usually have rules like "never ever use goto", but then you have Linux kernel code using goto, and it seems all weird. But here too, the coding convention that feels "stupid" to the rockstar coder is in place to make the code of the summer intern even remotely usable after they are gone.
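For context, the kernel's goto usage that trips up blanket "never use goto" rules is almost always the centralized error-unwinding idiom. A minimal sketch in C (the function and names here are illustrative, not from any real codebase):

```c
#include <stdlib.h>

/* Kernel-style "centralized exit": error paths jump forward to
 * labels that unwind, in reverse order, whatever was acquired.
 * This keeps one exit path instead of duplicated cleanup code. */
int process(size_t n)
{
    int ret = -1;
    char *buf1, *buf2;

    buf1 = malloc(n);
    if (!buf1)
        goto out;
    buf2 = malloc(n);
    if (!buf2)
        goto free_buf1;

    /* ... real work on buf1 and buf2 would go here ... */
    ret = 0;

    free(buf2);
free_buf1:
    free(buf1);
out:
    return ret;
}
```

The point for the style-guide debate: the goto here is structured and local, which is a different animal from the spaghetti jumps the "never" rule is trying to ban.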
Maybe I am one of those people? The first time I hired a developer (contract, and senior contract at that) I had written a prototype of the software I wanted, with the core features working, but it was clunky. It was to interface a modern piece of software to a legacy system with a poor API. I gave him the code and said that I wanted the same thing, but written in a professional and maintainable way. I asked him to let me know about anything he was not sure of, and I would document it further. "Let's be Agile about it, we don't need to write heaps of documentation," he said. Apart from having to make him recover from various flights of fancy about new features I hadn't asked for, he kept blundering on with things he hadn't understood properly or lacked the specific domain knowledge for. I had those things, and many were working in my prototype. Several times the project stalled due to a problem he couldn't fix, and I recovered it with my limited (at the time, very limited) coding knowledge. In the end we went live with his solution, which never quite achieved its aims, but when the business found new requirements I couldn't face this again and wrote the whole thing from scratch.
One bad dev? Well, he was a lot like many of the ones I met subsequently, tbh. Far too eager to find the one vague thing in what you asked for and interpret it the wrong way. Far too quick to think that users should bend to fit the software, and far too willing to plow on with code when they should have been looking at a flow chart. The great development managers I have met are the ones who have spent considerable time exploring the domain they are working in, know how to talk to business people, and stop and ask when something is ambiguous. Sometimes you need to get out of the tech stack and think more in terms of processes.
My methodology is now something like this.
1) I write in plain English what I want (thanks Joel Spolsky)
2) I bullet point my definite requirements
3) I explain a process in simple flow chart blocks
4) I send this to my devs in good time
5) I sit down with them and explain it again, drawing charts as I go if necessary.
6) I invite and expect questions/challenges and note them down
7) I amend my docs and reissue it
8) I let someone else translate whatever I wrote into 'user stories' or whatever else they want to do
9) I test the results against the requirements I first wrote; now I know whether I have conveyed my meaning correctly
10) I meet with them regularly and take plenty of time to just talk through where we are. I like to have a mix of business people and engineering in the room, because it makes the devs talk in different levels.
11) As soon as there is something to show, I get into UAT with the least tech-savvy people, who have no prior understanding of the project. Secretaries, clerks and call centre operators all find different faults than the tech people, because they do the real jobs every day. They ask all the straightforward questions that you never thought of.
Sounds obvious, but I meet a lot of third-party agencies and developers who look like they have never seen anything like requirements from a client! I have had them say things like "but we use Agile, we will collect our own user stories." How dumb is that: "we are the smart ones, we will cut out the people with domain knowledge and guess"! I tell them they can do what they like, but I will test their finished article against my original requirements when I am deciding if I am going to pay for what they did!
The main thread of this is to force as much human interaction between the developers and the business as possible, all the way through the project.
That still happens, but it seems the "cogging" of developers has largely made that a rarity as cheap offshore developers aren't expected to do that sort of thing anymore.
As an aside Spolsky should be required reading for any person who oversees any department that interacts with developers in any way. Most people think of development as a scientific endeavor, but it's largely artistic with mathematical tools.
The first job of a dev is to define the requirements with the clients. But most devs don't know how to do that, or don't want to.
What do you do if someone says they don’t know how to estimate a particular task?
Sounds pretty much like:
> Individuals and interactions over processes and tools [1]
I don't know why, but somehow people don't get that agile is not a methodology but a spirit.
Because of all the agile coaches, boards, trainings, conferences and companies. It then feels more like a religion.
Oh wow amen to this. I've started calling one of our managers "Reverend". Particularly when he begins a meeting with a statement that's starting to sound a lot like "we are gathered here today to...".
And this is not helped by the true believers constantly saying "you're doing it wrong."
So, point me at someone doing it right then! Because the landscape is coated with people "doing it wrong", and since I'm doing this as a job, I don't have time to sift through piles of pyrite for a single nugget of gold.
Even now, when we ask for something simple like a new Confluence board, we have to actively push back against new rules, additional restrictions, and more gimmicky Atlassian plugins. It pains me that these misguided parasites are paid to make my life worse.
A cult, more than religion. I believe "cargo cult" is the exact term.
The change to Agile is often led by management, not by developers. And when it's led by management, it's done in a way that keeps management central to the development process (which is the opposite of original Agile), which means an over-focus on process.
Part of this is cynical survival skills: management wanna manage. The more forward-thinking ones probably realize that original Agile is an existential threat, and they can 'get ahead of' that threat by controlling how it's implemented. But most commonly, it's just plain simple myopia: Process guys will tend to view Agile as a process because they're process guys. And when they implement it, they will make management of the process the central role in everything.
I've found the best "methodology" for delivering decent results is sticking with short iterations. Software is often about doing something we've either not done before, in a way we've not done it before, with people we've not done it with before. So we will have surprises (aka delays) on the way. The more frequently we check what those delays are, the more realistic we can be about whether we can make it on time, or whether we need to cut scope or pull in more help to make it on time.
This can be true but can also be completely false. Massive differences in productivity are possible depending on how individuals work together on a team.
Great teams can produce much more than mediocre ones, but they too have a limit. When the deadline is set too close, one of those four things has to give, and it is good to know in advance which one it is, so the team can set priorities accordingly.
> Massive differences in productivity are possible depending on how individuals work together on a team.
Addressed by this:
>> Exceptions I've seen are with mature and well-bonded teams working on familiar scope they understand clearly, with a timeline they themselves defined.
?
https://en.wikipedia.org/wiki/Project_management_triangle
Time, cost and scope. Pick two.
Do you work with me?
I've been fighting this battle for the last 3 months. They've added almost an hour of meetings every morning at 9am. Half the office shows up at 8:30. That's enough time before the meeting to check email and get coffee, so essentially no work starts until 10. At 10 they get back to their desks, there's a bit of whining about whatever management has changed that day and how stupid the meetings are, some email correspondence, and by 11 nothing is done and they go to lunch. They get back by 11:30-12 and finally start doing work. So your 8-hour workday turns into maybe 4 hours of real labor time, and no one works 100% of that.
The religious people (mentioned in the article) were harmed by false adherence. They adhered to the headlines and warped the substance of what the Methodology said. I remember (with pain) a place that wouldn't develop development scaffolding. They had rules for software development, good ones, motivated by achieving near-perfect uptime for customer-facing services. Implementing a scaffolding service or crontab to that standard was a lot of work.
Then there's the non-adherents who eroded the Methodology. Like the scrum shops that eroded scrum by deemphasising the product owner and stories until the result looked more like a waterfall.
The Methodologies may be broken as a whole but the practice I've seen was generally so distorting that I feel it's unfair to blame the Methodologies.
This reminds me of the people on the far right or left who believe, "[Capitalism/Communism] can't fail; it can only be failed."
Some of the failure I've seen can be partly explained by people who wanted to have their cake and eat it. Who wanted, say, the promised advantages of Scrum but were not willing to pay its costs (lack of long-term plans and fixed finish dates).
That's not all. It's part of the explanation for some of the suckage I've experienced.
I do blame people for not making up their minds. The people who invented scrum were willing to give up some parts of long-term planning, and got remarkable results for that. They are not to blame when others later failed by not giving up those same things.
Maybe some blame should go to consultants who oversold the benefits of Methodologies without stressing the costs: "YES, YOU ACTUALLY HAVE TO DO THIS, IT WILL WORK BADLY OTHERWISE".
Note: I understand "failure mode" to mean that rules are being ignored, or guards have been disabled.
* Frequent releases (i.e. do iterations or 'sprints' or whatever).
* Accept that your methodology has bugs and 'fix' those bugs between releases. Most software is horrible and buggy. Don't trust the "methodology gods" that they wrote a perfect piece of working software. It's probably half assed and worked semi-well for their specific use case so they 'released' it along with a reality distortion field.
* Accept that different use cases require different methodologies. Writing space shuttle code? You need vastly different team dynamics to a group of 10 people at a marketing agency running short lived campaigns.
* Follow the UNIX philosophy: don't have ONE methodology that you follow to a T - string together a bunch of small, self-contained rules and team processes that serve your purposes and iterate upon them.
tl;dr fuck scrum. it's the internet explorer of methodologies.
They will not attract the most talented software developers (on average, not in all cases), and the business people for whom the software is a means to an end care more about consistency and predictability than quality.
As a result, fungible resources (humans), deeply regimented stories, regular delivery milestones (sprints), and consistent velocity IS the best possible outcome.
I don't think it really matters what kind of company you work for. I've worked for many software and non-software companies and the same issues crop up in both.
The main one is that scrum accelerates the accretion of technical debt, which "business people" can somehow not care about right up until the point where it drives them out of business.
It has some good ideas (retros, sprints, no deadlines) and some terrible ideas (treating team members as fungible, story pointing/velocity, too many meetings, PO has to make decisions about specific pieces of tech debt).
My main problem with it is the teachers, coaches and promoters who take an all-or-nothing view of it and who treat deviations from the official Scrum policy as, by default, a problem with the team rather than, potentially, a bug in Scrum.
I used to think that it was a good base to work from, but after arguing fruitlessly with the people who take a religious approach to it I've come to the opinion that it just needs to be trashed wherever possible, because the problems it does have will only be resolved by moving on to something else. Better to move on sooner rather than later.
So, fuck scrum.
The problem is not software methodologies per se, it is trying to apply a software methodology to software development where the priorities of the methodology are fundamentally at odds with the requirements and goals of the software being built. The root of the problem is the notion that there is one software methodology that is efficient and productive for all possible types of software development. I would argue that there is an optimal methodology for most software but it is a different methodology for different types of software.
If we discard the oft-argued proposition that a PHP website, an embedded system, and a high-performance database kernel can -- and should -- all be developed with the same software methodology then this entire discussion goes away. A software methodology is a tool; they work best when you select the best one for the job.
There's no certificate, role, set of tools or prescriptive process. There's no specification, it's not a product, or job title. There's no one true voice on what DevOps is or isn't. It's about attitude, ideas, customs and behaviours. Culture, paradigms and philosophy. It's a way of thinking, a way of doing and a way of being. Practicing as well as preaching. It's a conversation. It's about taking the best experiences and sharing those with others.
Consider Waterfall where development happens, then it's tossed over the wall to testing. Almost everyone acknowledges that this is a bad idea and so testing and development happens concurrently now, but not necessarily by the same people (can still have testers and developers). I guess DevTest doesn't have a good buzzword sound to it, but it's what we do.
DevOps sees the same issue with how developers finish the work, it gets tested, then it's tossed over to ops who don't really understand what the code does (not their fault, they didn't build it). DevOps brings the two groups closer together so ops still does ops, dev still does dev, but as a group they shorten the cycle so that dev can deal with the real issues faced by the operators instead of always lagging months and years behind. It's not necessarily an organizational change, the critical part is opening communication between the two groups.
In a way, it's an extension of agile (little-a, because I don't mean the shit the coaches sell), where the operators become the customer and development gets feedback from them. It's one of those "obvious" things that for some reason isn't very commonly practiced.
Now, that said, in practice it has issues. Management sees it like your comment and tries to make dev do ops or ops do dev, or mash them into one team (which may or may not work). More likely they try to make the devs do ops work and it's a disaster. It may run well, but features are added slowly. Or features are added, but the operations story is a nightmare. They end up understaffing the group (1 devop = 1 op + 1 dev, right?) and creating problems.
The original agile manifesto didn't have a certificate or set of tools either. There is plenty of DevOps tooling, and I've little doubt that management types will hijack DevOps for their own gain and offer certifications. It'll probably be easy for them, since there is no single "DevOps"; everyone has their own idea of what it means to them.
I don't know why people buy into the "when we're agile enough, everything's going to be OK; until then, let us use this whip and self-flagellate for not being agile enough" mindset.
The methodology cannot be effective without creating the right personal dynamics.
Good interpersonal dynamics are important, but the sheer irony that the first value of the Agile manifesto emphasizes deprioritizing the main thing that would actually get you away from waterfall-like development is pretty staggering, IMHO.
Never mind. Forget the tests. If you have a meeting at 11am every morning where anybody who sits down is shouted at then you've done it. You're "agile".
Personally, I am of the opinion that a strong emphasis on test-driven development will, in the long run, cause waterfall-style development. Tests are all about risk prevention, rather than risk mitigation. Prevention eventually becomes exceedingly expensive, whereas mitigation is all about building robustness into the running system. Because of that, mitigation systems, such as true microservices or actor systems, are inherently more dynamic and cause less latency in development.
I don't understand your last statement. It seems to confirm my position: "... anybody who sits down is shouted at ... ". The process (standing up, not sitting down) is less important than good team dynamics (not getting shouted at).
(edit: down-voters, please share why you down-vote! I'd like to know. Also, please don't down-vote based on opinion, but on weakness of argumentation instead.)
We always used a pseudo-waterfall but with much shorter iterations. We called it cinnabun. For example, we would cut 4 CDs a year, so our iterations were about 3 months. We would plan what we wanted to do in the 3 months (bugs + features), code it, developer-test it and throw it to QA. Once we had a good build, we would distribute it, have a small celebration, then start over again.
It's similar to agile, but with the 3 month cycles, you could actually plan and design a lot better because you could see out a little further.
I suspect Agile is so popular because the business doesn't want to think of, make and stick to a decision for 3 months.
http://www.mfagan.com/pdfs/software_pioneers.pdf
http://infohost.nmt.edu/~al/cseet-paper.html
http://se.ethz.ch/~meyer/publications/acm/eiffel_sigplan.pdf
http://www.anthonyhall.org/c_by_c_secure_system.pdf
On the concurrency side, there were also SCOOP for Eiffel and Ravenscar for Ada, which eliminated race conditions by design. Some methodologies in the high-assurance sector used tools like the SPIN model checker for this. People spent a long time talking about those bugs while some design methods just removed them entirely. A lot less debugging and damage might have happened in industry if the aforementioned methods had gotten way better through industry investment.
https://www.eiffel.org/doc/solutions/Concurrent%20programmin...
As an example, let's say 100 devs jump in. The task is to create a simple Android app, with a requirements statement provided, with server back end, launch it into the app store, support it for some period with bug fixes and improvements, and then declare it '1.0 released' to wrap up the experiment.
What you'd wind up with is a variety of team sizes, a variety of team experience, a variety of development systems used, a variety of outcomes. But all building the same software.
The key would be that as many attributes of each team's efforts as possible would need to be recorded and entered as data to be studied in search of patterns.
Repeat this n times and I believe valuable insights could be gained.
Rather than trying to control for all the variables of team size, experience, method, you control for the end product being targeted and then look for insights into the variety of approaches that teams took.
From the other direction, even if you get the value of controlling for the project itself, that might also add some bias. Could be that for a project with that setup waterfall actually works pretty well, but is it representative of projects overall? Are most software projects comparable to developing a simple Android app with a well defined specification up front?
I do agree that it would be good to do this kind of experiments where multiple teams get tasked with building similar systems to figure out what works. But I don't think it makes sense to actively avoid controlling for variables. That would make the results very hard to interpret and much less usable.
When you don't understand how or why something works, this is how you go about it. Let's make airplane-shaped things out of coconuts.
Does “scrum for surgery” exist? What is an equivalent of “waterfall” in warfare?
Does something like this exist at all?
Leaving aside the problem of always finding "above average people" - at least for now - I think that this has a fatal flaw: what happens when someone leaves and/or someone else joins the group?
I suppose that hospitals and the Army have this happening fairly often; they must have "methodologies" catering to a wide spectrum of talents, which also accomplish satisfactory results even when dealing with thorny, unexpected problems.
What do they use? Is this a "methodology"? (One important thing that I am not sure is adequately represented in IT methodologies is having an established vocabulary to describe situations: we have "Patterns", but these are low-level and divorced from the actual business-specific scenario. This is just one example, but I think it helps point out that IT methodologies are trying to standardize the wrong elements.)
Also famously, Kanban originated at Toyota for manufacturing.
Good, Fast, Cheap. Pick two.
That's what you are doing when you are pitting requirements, a time frame, and a budget against each other.
The first problem is this is a company wide process, not just software development. The only thing development tells you is how long it will take given the budget. Development doesn't define the requirements or the budget.
The third statement in the Agile Manifesto, which tends to get overlooked:
"Customer collaboration over contract negotiation"
That is entirely a business process and ultimately determines both the requirements and the budget. It is something that is sorely lacking at most companies, regardless of how hard their engineering department tries to follow agile. It doesn't work without the full company buy-in.
Another thought experiment: imagine getting two teams of programmers using the same methodologies and everything else and expecting the results to be the same. It's just not practical to perform studies like this because there are too many variables.
Of course, not many people have the resources to do that kind of experiment.
That's what I meant about it not being practical. Who would invest that amount of money? There are infinite variations of the different methodologies as well.
I'm not sure what the solution is but it's tiring seeing bad studies used to promote certain approaches.
The first team does A in "Agile" and B in "Waterfall"
The second one does A in "Waterfall" and B in "Agile"
I bet working like this would turn up at least some interesting stuff.
The most important thing any software team needs is proper logging, monitoring and metrics. No matter how great your process and engineering culture, you'll need logging, monitoring and metrics; things will happen. The worst part is that this is relatively cheap and simple to do (at the scale most of us operate at), with huge rewards, and most teams still do it wrong: logging too much noise, collecting metrics that show what's going right instead of what's going wrong, swallowing exceptions, etc. This is the litmus test.
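To make the "cheap and simple" claim concrete, here is about the smallest useful version: one parseable line per event, plus an error counter that is never silently dropped. A sketch in C, with all names my own invention:

```c
#include <stdio.h>
#include <time.h>

/* Minimal leveled logger. The names (log_level, log_event,
 * error_count) are illustrative, not from any library. */
typedef enum { LOG_INFO, LOG_WARN, LOG_ERROR } log_level;

unsigned long error_count; /* the most trivial possible metric */

static const char *level_name(log_level lvl)
{
    switch (lvl) {
    case LOG_INFO: return "INFO";
    case LOG_WARN: return "WARN";
    default:       return "ERROR";
    }
}

void log_event(log_level lvl, const char *component, const char *msg)
{
    char ts[32];
    time_t now = time(NULL);

    strftime(ts, sizeof ts, "%Y-%m-%dT%H:%M:%SZ", gmtime(&now));
    if (lvl == LOG_ERROR)
        error_count++;  /* count errors; never swallow them */
    fprintf(stderr, "%s level=%s component=%s msg=\"%s\"\n",
            ts, level_name(lvl), component, msg);
}
```

Even this toy version passes the litmus test in the comment above: every event is one grep-able line, and the error metric moves when something goes wrong, not when something goes right.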
Next are automated tests. Unit tests, integration tests and fuzz testing. The downside with this is that it takes a long time to master. Yes, it costs time at first, but that's why you have senior developers who should be able to use tests to save time and teach others from their mistakes (like too much mocking).
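For a sense of how cheap the entry point is: even a plain assert-based unit test catches the boundary mistakes that cause most grief. A toy sketch (the function and cases are mine, purely for illustration):

```c
#include <assert.h>

/* Function under test: clamp v into the inclusive range [lo, hi]. */
int clamp(int v, int lo, int hi)
{
    if (v < lo)
        return lo;
    if (v > hi)
        return hi;
    return v;
}

/* Unit tests: one normal case plus the boundaries -- the places
 * off-by-one bugs actually hide. No mocks needed. */
void test_clamp(void)
{
    assert(clamp(5, 0, 10) == 5);    /* in range: unchanged */
    assert(clamp(-3, 0, 10) == 0);   /* below: pinned to lo */
    assert(clamp(42, 0, 10) == 10);  /* above: pinned to hi */
    assert(clamp(0, 0, 10) == 0);    /* boundary value itself */
}
```

The mastery the comment mentions is mostly about choosing cases like these, not about tooling.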
Finally, code reviews and pair programming. Almost every line of code is an opportunity to teach and to learn. No methodology or tool can help if you hire junior programmers and don't do pair programming (or some other really involved mentoring, but I don't know of any).
Technical debt is real. Most of the time people don't have time to do things right is because they didn't take the time (or didn't know how) to do things right in the first place.
Most of the struggle with methodology research is due to the difficulty of objectively measuring code productivity or quality.
Agile/waterfall/etc. are PM methodologies used to manage software development projects. Agile especially has limited use beyond software development.
Methods of developing software are things like: domain-driven design, data models first, TDD, etc.
Then there are programming paradigms: OOP, FP, actor-based, etc.
So the "techniques" the article lists in one list are of different types. All of them have to be evaluated against the people who have to use them (don't force OOP on an FP team, or vice versa) and the type of problem to be solved (TDD is less useful for a simple UI project than for a complex algorithm involving time and lots of corner cases).
You start with the premise that the requirements, time constraint and budget were all magically set right before the project began.
Ultimately, a lot of people tend to lose track of higher-level objectives when working on the (admittedly complex at times) implementation details. This is probably the biggest productivity killer in the business.
How many of us have had that first demo with some people external to the dev team and for all the feedback to be super obvious things that could have been caught before a single line of code was written?
It's not a regular pipeline kind of workflow, like some Taylor-inspired assembly line, or regular old civil engineering.
Besides, all those methodologies are unscientific BS invented by consultants, not something derived from actual studies (even when there are some comparative studies involved, they are laughable in scope by scientific standards).
After all, if you were repeating yourself, you'd just re-use the methods and classes and packages you'd already written; worst-case, you could copy+paste the code and tweak it.
And since you're doing something novel, of course you're not going to be able to predict how long it will take, beyond extremely broad guesses.
What I have seen work is methodologies adopted for the benefits the methodology actually delivers, not as a checkbox to say "we are X".
With all respect, I don't misunderstand TDD or OOP. I agree that those aren't methodologies in the strict sense. But rigid adherence to OOP design or TDD can paralyze a team and focus development on goals that aren't the customer's priorities. OOP and TDD can influence how the team works and what shape the project takes just as much as waterfall or agile. When I read articles claiming that TDD is the sure path to reliable development, that's a methodological claim, not a technical practice.
Better, faster, cheaper: you get to choose 2 and only 2, and we've selected faster and cheaper.
The one true methodology. Preach!
There seems to be a false premise here. Methodologies don't deliver consistent results on time and within budget, because they attempt to help figure out the software requirements so that an actual problem can be solved, not just useless requirements satisfied.
This is, in fact, the answer.
You haven't truly SEEN office politics until you've worked on a team of developers. I'm shocked a reality show hasn't come out yet about software development. It would make Survivor look like Family Matters.
I get along with my co-workers too, but social skills are more than just playing nice together.
This runs counter to a lot of my experiences. I don't tend to see much office politics at all with software developers. Maybe because salary isn't tied as closely to rank, and it's so easy to shop around and jump ship if you feel like you're getting fucked.
It's the same story at every company: developers are under constant pressure from management, we aren't given the tools or help we need at the right time, we get no credit, but all the blame when things go wrong. When people feel threatened, the open and creative part of their brain shuts down and it becomes fight or flight and every man for himself. Devs become extremely rigid and dogmatic, as every bad work experience leaves a scar on us, and they resolve to never EVER make the same mistake again.
"Hell is other people" is like the personal motto of some programmers. You don't agree with me, fine, I'll just refactor your code when you're sleeping. People read random blog posts and then take it as holy writ, undisputable proof, and if you disagree then let me write a long email lecture to educate you about why I'm right and you're wrong. I've seen people get into physical FIGHTS at scrum, resulting in broken bones. I wish I was making this up.
But that's the problem: to be successful, you have to be good at politics.