To sell a product, you need to add it to your catalog; any shop owner knows this. Unless you have enough interesting products to keep a customer browsing your shop, that same customer will move to a competitor's site and, as a consequence, will not buy from yours.
All shop owners have to strike a balance: stock the right number of products, maintain quality data for them, and showcase that data to customers to entice them to buy as often as possible!
Hosting architecture has transformed our capabilities. Where ten years ago we might have aimed for 50k products in a catalog, today we have come to accept that a shop with 200k products is reasonable to implement.
The Magento framework can handle this without any fancy features beyond what a standard LAMP/LEMP stack provides. The problem comes when performance degrades. By default, Magento is designed to handle more than 200k products, but sadly, in practice, I have witnessed database performance suffer when products are blindly added without a well-organised process.
Here is the recipe for getting it as wrong as the shop I tried to rescue: I found that the number of site-specific product attributes was 3 TIMES higher than the default figure. In detail, this number was the result of neglecting to organise the data before performing the import.
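To see why tripling the attribute count hurts, remember that Magento stores product data in EAV tables, so the value tables grow roughly with products multiplied by attributes. The sketch below is a hypothetical back-of-envelope calculation, not a measurement; the attribute counts and fill rate are assumptions chosen only to illustrate the multiplier.

```python
def eav_value_rows(products: int, attributes: int, fill_rate: float = 1.0) -> int:
    """Rough row count across EAV value tables: one row per filled
    attribute per product. fill_rate models attributes left empty."""
    return int(products * attributes * fill_rate)

DEFAULT_ATTRS = 100            # assumption: order of magnitude for a stock install
BLOATED_ATTRS = DEFAULT_ATTRS * 3  # the 3x figure seen on the rescued shop

baseline = eav_value_rows(200_000, DEFAULT_ATTRS)
bloated = eav_value_rows(200_000, BLOATED_ATTRS)
print(f"baseline rows: {baseline:,}")  # 20,000,000
print(f"bloated rows:  {bloated:,}")   # 60,000,000
```

The exact numbers matter less than the shape of the curve: every extra attribute is paid for once per product, so an unplanned import multiplies the rows the database must index and join on every catalog query.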
This performance problem was unfortunately not something that could be fixed once the site was live, short of a costly re-engineering of the data along with backups of the orders and the like. Instead, the shop owner was doomed to inflate the hosting budget to compensate for that error.
Should I hire a database expert when I start my shop? Not at all: scaling a catalog is possible and, like most things, what is critical is not so much gaining data expertise as embracing the fact that the default framework does a great job. We just have to be wary that changing the default inner features has a cost. Whether the change is in the codebase or the database, the more we change, the more risk we bring to the business.