Recent studies show that offering dental benefits is an important way for companies to appeal to job candidates and keep existing employees happy. Three-quarters of employees feel it is very important that their employer provide dental coverage, and this holds across every age group, according to a survey by the National Association of Dental Plans (NADP). Although workforce trends shift as older employees retire and younger ones are hired, dental insurance shows no sign of lessening in importance.…