The human hand is particularly important in Computer Vision and Graphics because it is used for a wide range of actions and behaviors, such as grasping, pointing, and gesturing. Human hand shape modeling has a long history, with recent approaches employing statistical models computed from tens of subjects. However, there are currently no diverse, large-scale datasets and models of human hands, leaving a huge gap when it comes to capturing the full extent of possible hand shape variation across different ages, genders, and ethnicities. We propose to bridge this gap through a novel crowdsourcing solution that scales the number of scanned subjects to the thousands, followed by a framework for robustly processing the 3D hand scans and constructing next-generation, large-scale hand models with unprecedented expressiveness and realism. More specifically, we aim to develop an easy-to-use and inexpensive hand scanning process and to construct a global hand model accounting for all demographics, as well as tens of additional bespoke models tailored to specific demographic groups. The source code, processed 3D hand scans, and constructed hand models will be made publicly available. The project will have a multitude of direct applications, ranging from gesture recognition and AR/VR to rehabilitation and prosthetics.