I'm a freshman in college and I keep going back and forth on whether I should become a doctor. There seem to be new articles all the time about how medicine isn't such a hot career anymore due to decreasing reimbursements, rising practice costs, having to see more patients every day, fighting with insurance companies, etc.

For those of you who will argue that medicine is not the career for you if you're in it for the money, I understand that, and money is not my only incentive for becoming a doctor. I like the actual work itself and would enjoy doing it every day.

Many parents who are doctors are telling their kids not to go into medicine because they think it's just not worth it. I don't clearly remember the statistics, but something like 30%-40% of doctors are unsatisfied with their career choice and would change their minds if they had the chance. Several of my own family members who are doctors are telling me to avoid medicine as well.

My whole point is that I don't mind enduring all the years of schooling and I like the profession, but I don't want to regret the decision someday because of all the changes that are going on. Since many of you are already doctors or in the process of becoming one, I want to know your input on the situation. Is there any hope that things will improve, or will they stay the same, get worse, or is nothing certain just yet?