People typically have their teeth whitened to improve their appearance and boost their self-esteem. Whiter teeth are associated with beauty, cleanliness, and a healthier lifestyle, and a whiter smile tends to give you a more youthful, energetic appearance. When people have brighter, whiter teeth, they tend to smile more often and feel less self-conscious. A whiter smile also gives the people you are speaking with a natural place to focus and makes your smile appear friendlier.