I intended this series to be about fundamentals, and this week's post is no different. It's good for our country when businesses (both large and small) make money. To me this seems obvious, but to a number of politicians, I believe this may be a new concept. Because of that, if this sounds a little simple, I apologize ahead of time.
For the most part, we benefit as a nation and as a community when businesses make profits. When American oil companies make a profit, it's good for us. When pharmaceutical companies make a profit, also good. When Wal-Mart has a good year, it's very good. Remember, these companies do what they do because they are in the business to make money. You don't go to work expecting to donate your time to your employer. You expect to get paid. Why should the oil industry be any different?
Why is it good? When Wal-Mart makes money, its stock goes up and its stockholders make money. Typically, we think of stockholders as stuffy rich white guys who smoke cigars in their personal libraries at home. In some cases this is true. However, teachers' unions often invest in successful companies. State employees' unions do as well, as do many 401(k) and retirement funds. Not to mention the employees who have worked at a store their entire careers and have been able to purchase stock.
When companies make a profit, they are able to expand their businesses, and those expansions create jobs. You might be tempted at this point to say, "Ah! But what if those jobs go overseas?!" Fair point. However, if they are going overseas for reasons we can change, then we should make changes to make the U.S. more appealing for business. For example, instead of punishing businesses through tax laws, why not make our tax system more business-friendly? More businesses here means a larger tax base and higher employment.
When we talk about free health care or windfall profit taxes on any business, we are forgetting that these businesses exist to make a profit. Why are we punishing this? The American Dream has been to create a business, become successful at that business, and make money doing something you love. The government shouldn't punish this.
One last point. In most cases, businesses have a financial incentive to do what their customers want. If their customers want more fuel-efficient vehicles, auto manufacturers will make more fuel-efficient vehicles. If their customers want it, they can sell more, and that is why they exist in the first place. The only time government needs to mandate something like this is when customers don't want it in the first place.
The government makes a number of laws that cut into businesses, stifle them, or simply force companies to close up shop. Only a few representatives started out in business and understand what businesses go through. Many of these career politicians should spend some time understanding businesses (especially those in their districts) before discussing topics like "windfall profit taxes."
Without business, where would we be? If we crush businesses by forbidding them from making a profit, who will stay in business?