A recent Pew Research Center survey finds that only half of American adults think colleges and universities are having a positive effect on our nation. The leftward political bias held by faculty ...