Taking care of your skin is so important – and it’s something I only started doing within the last few years. I love the sun and I love being tanned, so I never cared about wearing sunscreen or hats; I only cared about getting as dark as possible. Big mistake. I know that now. I also didn’t used to care about taking off my makeup or washing my face properly. Another big mistake, lol.
I’ve learned a lot about skincare in the last few years, and sometimes it can get really overwhelming. There is literally a cream, serum, or oil for every part of your body. So what’s actually important, and which products are good?