Most agree that the spread of Christianity must have had something to do with it, but that can’t have been the direct cause, since the Church itself was never explicitly opposed to the institution and in many cases defended it.
So, did those in the U.S. consider it a Christian nation?