Due to their limited functionality, ubiquitous connected devices in the Internet of Things rely heavily on the computational and storage resources of the cloud. However, mainstream cloud systems require high network bandwidth and cannot satisfy the delay requirements of real-time applications. Therefore, a new paradigm called multi-access edge computing has emerged to offload the computation and storage needs of end-user devices to edge cloud servers located in the radio access networks of 5G mobile networks. In this paper, we study and compare three load sharing schemes, namely no sharing, random sharing, and least loaded sharing, which exploit the collaboration between clustered servers to different degrees. We develop computationally efficient analytical models to evaluate the performance of these schemes. These models are validated by simulation and then used to compare the performance of the three load sharing schemes under various system parameters. The comparison results show that the least loaded sharing scheme is the most suitable for fully exploiting the collaboration between the servers and achieving load balance among them, thereby reducing both the blocking probability and the waiting time experienced by users.
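The three schemes can be illustrated with a minimal Monte Carlo sketch. This is not the paper's analytical model: it assumes each edge server is a simple loss system with a fixed number of channels, Poisson arrivals, and exponential service times (all parameter values below are illustrative assumptions), and it estimates only the blocking probability. A job that arrives at a full server is dropped (no sharing), forwarded to a randomly chosen server (random sharing), or forwarded to the server with the fewest jobs in progress (least loaded sharing).

```python
import heapq
import random

def simulate(scheme, num_servers=4, capacity=5, arrival_rate=3.0,
             service_rate=1.0, num_jobs=20000, seed=42):
    """Estimate the blocking probability of a cluster of edge servers.

    Each server is a loss system with `capacity` channels. `scheme` is
    "none", "random", or "least" (hypothetical names for the three
    load sharing schemes). Parameters are illustrative, not from the paper.
    """
    rng = random.Random(seed)
    load = [0] * num_servers          # jobs currently in service per server
    departures = []                   # min-heap of (departure_time, server)
    t, blocked = 0.0, 0

    for _ in range(num_jobs):
        # next arrival to the cluster (superposition of per-server streams)
        t += rng.expovariate(arrival_rate * num_servers)
        # release channels for all jobs that finished before this arrival
        while departures and departures[0][0] <= t:
            _, s = heapq.heappop(departures)
            load[s] -= 1
        s = rng.randrange(num_servers)   # server the job first arrives at
        if load[s] >= capacity:
            if scheme == "none":
                blocked += 1             # no sharing: job is lost immediately
                continue
            elif scheme == "random":
                s = rng.randrange(num_servers)   # forward to a random server
            elif scheme == "least":
                s = min(range(num_servers), key=load.__getitem__)
            if load[s] >= capacity:      # forwarding target also full: lost
                blocked += 1
                continue
        load[s] += 1
        heapq.heappush(departures, (t + rng.expovariate(service_rate), s))
    return blocked / num_jobs

for scheme in ("none", "random", "least"):
    print(scheme, round(simulate(scheme), 4))
```

Under this setup, least loaded sharing effectively pools the channels of the whole cluster, so its blocking probability approaches that of a single large loss system and falls well below the no-sharing case, consistent with the comparison result stated above.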