Hi @SamM-0000
Not sure if you ever got this resolved for yourself.
I went through a similar painful journey a few years ago… I describe it in the last comment here:
I am not an expert on the Umbraco source code, so I am happy to be corrected, but my findings might point you in a helpful direction.
I believe that when you perform CRUD operations on a member record, the member content tree is locked for the duration of the operation. The same is true for content.
In a nutshell, you can only update one member at a time or one content node at a time.
The lock occurs at the database level.
So although the web app itself can handle 100s of concurrent requests (and I find it is very performant at serving static data), the database locking mechanism will only allow one member/node to be updated at a time. All other requests have to wait. Given enough requests, you start getting timeouts waiting for the database lock to release. The more updates, the slower the response time, the longer the lock is held for and… you get the idea.
Again, I am not an expert on the source code, just sharing my findings. So all this could be nonsense.
But take a look at the MemberService.cs Save() methods. (Last time I checked was V13.something and this was still there.) You will see calls to
scope.WriteLock(Constants.Locks.MemberTree);
From what I understand, this only allows one member to be updated at a time.
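To make that concrete, here is a rough sketch of the shape of the Save() path as I remember it. This is simplified and paraphrased, not a verbatim copy of the Umbraco source, so treat the details as approximate:

```csharp
public void Save(IMember member)
{
    using (ICoreScope scope = ScopeProvider.CreateCoreScope())
    {
        // Takes an exclusive write lock on the WHOLE member tree, not just
        // this member. Any other Save() call (for any member) blocks here
        // until this scope completes, or times out waiting for the lock.
        scope.WriteLock(Constants.Locks.MemberTree);

        // ... repository work, cache refreshing, notifications etc.
        // all happen while the lock is held ...

        scope.Complete(); // lock is released when the scope disposes
    }
}
```

So the lock is per-tree, not per-member, which is why concurrent saves of completely unrelated members still queue behind each other.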
But it gets worse if you use any notification handlers.
The lock is acquired BEFORE the notification handlers are called and released only AFTER they all finish. So if you make any downstream API calls from within a notification handler (for example, sending an email, creating a PDF), the lock is held for the duration of those calls too, making the issue even worse.
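For example, a handler like this would keep the member tree locked while the email call runs. The handler shape follows Umbraco's INotificationHandler pattern; the IEmailService and SendWelcome call are hypothetical stand-ins for whatever downstream work you do:

```csharp
public class SendWelcomeEmailHandler : INotificationHandler<MemberSavedNotification>
{
    private readonly IEmailService _emailService; // hypothetical downstream service

    public SendWelcomeEmailHandler(IEmailService emailService)
        => _emailService = emailService;

    public void Handle(MemberSavedNotification notification)
    {
        foreach (var member in notification.SavedEntities)
        {
            // This runs while the MemberTree write lock is still held,
            // so a slow SMTP/HTTP call here blocks EVERY other member save.
            _emailService.SendWelcome(member.Email);
        }
    }
}
```

If that call takes two seconds, every concurrent save queues behind it. The workaround we found was to do as little as possible inside the handler, for example just enqueue the work to a background queue and return immediately.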
Again… these are only my own observations. I like Umbraco and find it very performant when serving data, and the backoffice has been a great tool for us. But concurrent updates have been a constant issue for us. Enough for us to move all members to a custom identity server implementation, and we are considering moving large portions of our content nodes to custom APIs in the future.