Shrinking a Power Supply and the Challenge of Maintaining High Reliability (White Paper)
Over the last few years, as power supply technology has advanced, we have seen tremendous improvements in the power density of the solutions offered by many leading companies. This is a direct response to customer demands for smaller solutions, lower cost, and increased reliability. However, as you reduce the size of a power supply, you must ensure that component temperatures are kept low, or run the risk of reduced reliability.
If a power supply could convert all of the power entering it into usable output power, it would be 100% efficient. In practice, a certain amount of energy is lost during each conversion stage as one voltage level is converted to another, so the supply itself consumes some of the input energy. Efficiency is typically expressed as a percentage: the ratio of output power to input power.
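The relationship between efficiency and internal dissipation can be sketched in a few lines; the function names and figures below are illustrative, not taken from the paper:

```python
def efficiency(p_out_w: float, p_in_w: float) -> float:
    """Efficiency as a fraction: usable output power over input power."""
    return p_out_w / p_in_w

def power_loss(p_out_w: float, p_in_w: float) -> float:
    """Power dissipated inside the supply (lost as heat), in watts."""
    return p_in_w - p_out_w

# Example: a supply delivering 90 W from a 100 W input is 90% efficient
# and must dissipate the remaining 10 W internally as heat.
print(f"{efficiency(90, 100):.0%}")  # 90%
print(power_loss(90, 100))           # 10.0
```

Note how quickly losses grow: at 80% efficiency the same 90 W load would force the supply to dissipate 22.5 W, more than double the heat in the same enclosure.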
You will recall from first principles that energy cannot be created or destroyed, only changed from one form to another. Any inefficiency in a power supply is therefore converted to heat, the only form in which the supply can shed that energy. This heat raises the internal temperature of components, which in turn reduces the reliability of the power supply. It is therefore imperative that power supply designers have the skill set to manage this heat. In electronic equipment, the most prominent stresses are temperature (including temperature rise), voltage, and vibration. The effect of each of these stresses on each component must be considered. To achieve good reliability, derating factors have to be applied to these stress levels, and the derating must be traded off against its cost and size implications. Careful consideration is needed if the final design requires additional cooling.
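A simple derating check of the kind described above might look as follows; this is a minimal sketch assuming a first-order thermal model (ambient temperature plus dissipation times thermal resistance), with illustrative values and a hypothetical 80% derating factor rather than anything specified in the paper:

```python
def component_temp_c(ambient_c: float, power_w: float,
                     theta_ca_c_per_w: float) -> float:
    """First-order estimate of component temperature: ambient plus
    dissipated power times case-to-ambient thermal resistance."""
    return ambient_c + power_w * theta_ca_c_per_w

def within_derated_limit(temp_c: float, rated_max_c: float,
                         derating_factor: float = 0.8) -> bool:
    """Check a component stays below a derated fraction of its rating."""
    return temp_c <= rated_max_c * derating_factor

# A part dissipating 2 W with 20 C/W to ambient at 50 C ambient:
t = component_temp_c(ambient_c=50.0, power_w=2.0, theta_ca_c_per_w=20.0)
print(t)  # 90.0
# Against a 125 C rated part derated to 80% (100 C limit):
print(within_derated_limit(t, rated_max_c=125.0))  # True
```

Shrinking the supply typically raises the thermal resistance to ambient, so the same dissipation produces a higher component temperature, which is exactly the trade-off between size, cost, and reliability the paper describes.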
Download the full White Paper below.