Softmax and sigmoid, which compose the exponential function (e^x) with division (1/x), are activation functions often required in training. Secure computation over the non-linear, unbounded functions 1/x and e^x is already challenging, let alone over their composition. Prior works compute softmax either by its exact formula via iteration (CrypTen, NeurIPS '21) or with the ASM approximation (Falcon, PoPETS '21); they fall short in efficiency and/or accuracy. For sigmoid, existing solutions such as ABY2.0 (USENIX Security '21) compute it via piecewise functions, incurring a logarithmic number of communication rounds.
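Concretely, the standard definitions (with i indexing the K entries of the input vector; these formulas are textbook definitions, not taken from any of the cited protocols) are

\[ \mathrm{softmax}(\mathbf{x})_i = \frac{e^{x_i}}{\sum_{j=1}^{K} e^{x_j}}, \qquad \mathrm{sigmoid}(x) = \frac{1}{1 + e^{-x}}, \]

which makes the composition explicit: softmax requires a secure e^x per entry followed by a secure 1/x on an unbounded sum, and sigmoid composes the same two primitives on a single input.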