I recently bought a few shirts from a brand that I love. Their shirts used to be 100% made in the USA, but now their description states "assembled in the USA with foreign materials". Unfortunately, I didn't realize this until after they arrived and they were on final sale so I cannot return them.
Now I know "made in the USA" can be a big marketing ploy, but immediately upon opening the package I noticed how the new shirts felt significantly cheaper compared to the older ones I have. I understand the cost of foreign goods is lower and that you have to save every penny you can to make it in this day and age, but I was disappointed to find that this brand now seems to be cutting corners with its manufacturing, resulting in a cheaper-feeling product. Sadly, the brand has now lost a customer.
Anyway, do you all care if your products are made in America?
Do you feel you get a better product when it is made and produced here, or do you find it to be nothing more than a marketing scheme to get you to pay more?