Interactive Segmentation (IS) segments specific objects or parts by inferring human intent from sparse input prompts. However, this sparse-to-dense mapping is inherently ambiguous, making it difficult for users to obtain segmentations at the desired granularity and forcing them into trial-and-error cycles. Although existing multi-granularity IS models (e.g., SAM) alleviate the ambiguity of single-granularity methods by predicting multiple masks simultaneously, this approach scales poorly and produces redundant results. To address this issue, we introduce a novel granularity-controllable IS paradigm that resolves ambiguity by allowing users to precisely control the segmentation granularity. Specifically, we propose a Unified Granularity Controller (UniGraCo) that accepts multiple optional types of granularity control signals, providing unified control over diverse segmentation requirements; this overcomes the limitation of single-type control in adapting to different needs and improves system efficiency and practicality. To mitigate the excessive cost of annotating multi-granularity masks and their corresponding granularity control signals for training UniGraCo, we construct an automated data engine that generates high-quality, granularity-rich mask-granularity data pairs at low cost. To enable UniGraCo to learn unified granularity controllability efficiently and stably, we further design a granularity-controllable learning strategy that leverages the generated data pairs to incrementally equip a pre-trained IS model with granularity controllability while preserving its segmentation capability. Extensive experiments on intricate scenarios at both the instance and part levels demonstrate that UniGraCo offers significant advantages over previous methods, highlighting its potential as a practical interactive tool. Code and model weights are available at https://github.com/Zhao-Yian/UniGraCo.