christianity

medical dictionary

The religion stemming from the life, teachings, and death of Jesus Christ: the religion that believes in God as the Father Almighty who works redemptively through the Holy Spirit for men's salvation and that affirms Jesus Christ as Lord and Savior who proclaimed to man the gospel of salvation.

(12 Dec 1998)