Anything smart enough to deserve the label superintelligent would surely be smart enough to lie low and not disclose its existence until it had taken the necessary steps to ensure its own survival. In other words, any machine smart enough to pass the Turing test would be smart enough not to.