Cu wires were bonded to AlSi (1%) pads, encapsulated, and subjected to uHAST (unbiased Highly Accelerated Stress Test, 130 °C and 85% relative humidity). After the test, a pair of bonding interfaces, one associated with a failing contact resistance and one with a passing contact resistance, were analyzed and compared using transmission electron microscopy (TEM), electron diffraction, and energy-dispersive spectroscopy (EDS). The data suggested that the corrosion rates were higher for the more Cu-rich Cu-Al intermetallic compounds (IMCs) in the failing sample. The corrosion was investigated taking into account the electromotive force (EMF), the self-passivation of Al, the thickness and homogeneity of the Al oxide on the IMC, and the ratio of the Cu-to-Al surface areas exposed to the electrolyte for a given IMC. The preferential corrosion observed for the Cu-rich IMCs is attributed to the high ratio of cathode-to-anode surface area exposed to the electrolyte and to the lower homogeneity of the passivating Al oxide. The corrosion of the Cu-Al IMCs is a manifestation of the well-known phenomenon of dealloying. With this understanding of the corrosion mechanisms, suppressing the formation of Cu-rich IMCs is expected to be an approach to improving the corrosion resistance of the wire bonds.