By the sounds of it, the first point is handled by having essentially a year-long probationary period, and then a further two-year period before someone becomes fully entrenched in the org as a full partner. That is almost certainly long enough to determine whether someone is going to be a piss taker, so other instances of underperformance can be handled via supportive mechanisms.
It’s worth highlighting that performance “curves” in some companies seem to end up laying off reasonably productive people while preserving people who are great at gaming the system/metrics.
For conflict resolution I don’t know how they do it, but if I were in charge of this I’d probably set up a dedicated, HR-like body for it: one that is democratically accountable but still only deals with that kind of thing as a last resort (assuming it can’t be sorted out between the team members themselves).
Many worker co-ops have been resilient to recessions because members often choose to temporarily lower their own pay/share of profits (or make similar arrangements) rather than resort to layoffs. https://www.yesmagazine.org/issue/new-economy/2009/06/06/mondragon-worker-cooperatives-decide-how-to-ride-out-a-downturn
I guess it depends heavily on the nature of the crawler. Does it actually extract links from robots.txt, or does it merely ignore them? And if the crawlers are distributed, do the page hits even come from the same IP that robots.txt was fetched from?
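One way to get at both questions is a robots.txt "honeypot": list a Disallow path that is never linked from anywhere else, then cross-reference hits on it against which IPs fetched robots.txt. Here's a minimal sketch of that check, assuming a standard combined-format access log; the log path and the trap path name are just illustrative assumptions, not anything a particular crawler or server mandates.

    # Sketch: flag IPs that hit a robots.txt-only "trap" path, and note whether
    # the same IP ever fetched robots.txt itself. Paths are assumptions.
    import re
    from collections import defaultdict

    LOG_PATH = "access.log"            # assumed web server log location
    HONEYPOT_PREFIX = "/robots-trap/"  # assumed path that only appears in robots.txt

    # Enough of the combined log format for this check: IP ... "METHOD PATH PROTO" ...
    LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*"')

    robots_fetchers = set()           # IPs that requested /robots.txt
    honeypot_hits = defaultdict(int)  # IP -> number of hits on the trap path

    with open(LOG_PATH) as f:
        for line in f:
            m = LINE_RE.match(line)
            if not m:
                continue
            ip, method, path = m.groups()
            if path.startswith("/robots.txt"):
                robots_fetchers.add(ip)
            elif path.startswith(HONEYPOT_PREFIX):
                honeypot_hits[ip] += 1

    for ip, count in sorted(honeypot_hits.items(), key=lambda kv: -kv[1]):
        note = "also fetched robots.txt" if ip in robots_fetchers else "never fetched robots.txt"
        print(f"{ip}: {count} hit(s) on the trap path ({note})")

An IP that hits the trap and also fetched robots.txt is pretty clearly mining the disallowed paths; trap hits from IPs that never touched robots.txt would suggest a distributed crawler reading robots.txt from one node and crawling from others.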
It gets harder and harder to get away from CDNs and captchas, which, for the most part, are not exactly good things from an open source POV.