Sound is a tricky thing in space. A sound is a pressure wave, an oscillation in the density of air or water, which moves through that medium until it reaches something it can rattle. If that wave reaches a human ear, and if the oscillation falls within the range of frequencies we are sensitive to, it will be heard. Our Earth produces a great number of these pressure waves, from the voice of a person speaking next to you, to the crash of a wave on a beach, to the sonic boom of an airplane above you. However, plenty of sounds are produced which fall outside our range of hearing - with an instrument tuned to receive those pressure waves, we can prove their presence, but it would be impossible to play them back and hear them without speeding up the recording.
In space, we have a major problem with recording sounds; there’s no atmosphere for sound waves to travel through, so any pressure waves an object may be producing will be instantly silenced without a medium to compress. However, if you’re clever about it, there are other ways of recording information which can be translated into a sound; the easiest one is vibrations. The ‘crunch’ of Philae, Rosetta’s lander, hitting the surface of the comet 67P made the rounds - but this noise is not, in fact, the result of a microphone on the lander. It is a translation of the vibrations measured in the lander’s feet at the moment it hit the comet’s surface.
However, if you want to translate a data set into sound, you’re not limited to just dealing with vibrations. You can turn pretty much anything into a set of tones, if you’re creative enough. Sonification is a booming area of data manipulation -- it’s another face of the data visualization scene; instead of presenting the information visually, you can encode it audibly, and listen to it over time. You simply have to decide what you want the pitch of the musical note to correspond to, what you want the timing between each note to correspond to, and what you want the volume to correspond to.
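That three-way mapping can be sketched in a few lines of code. This is a minimal illustration, not any real mission's pipeline: the function name, the base frequency, and the choice to make notes louder the further a value sits from the mean are all my own assumptions.

```python
# Illustrative sonification sketch: each data point becomes one note,
# described as a (pitch in Hz, duration in s, volume 0-1) triple.

def sonify(values, times, base_freq=220.0, octaves=2):
    """Map values to pitch, gaps between observations to timing,
    and distance from the mean to volume."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    mean = sum(values) / len(values)
    notes = []
    for i, v in enumerate(values):
        # Pitch: scale the value across `octaves` octaves above base_freq.
        pitch = base_freq * 2 ** (octaves * (v - lo) / span)
        # Timing: each note lasts until the next observation arrives.
        dur = (times[i + 1] - times[i]) if i + 1 < len(times) else 1.0
        # Volume (arbitrary choice): louder for values far from the mean.
        vol = abs(v - mean) / span
        notes.append((pitch, dur, vol))
    return notes

notes = sonify([0.0, 0.5, 1.0], [0.0, 1.5, 3.0])
```

Swapping which column of your data feeds the pitch, which feeds the timing, and which feeds the volume gives you a completely different piece of "music" from the same observations.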
For data coming from a spacecraft which monitors the Sun, there is often a new image every hour and a half or so. In this case, the pacing between notes is easily set by the time between observations, which will form a regular cadence. However, extracting a volume and pitch out of the data will depend very much on exactly what part of the data you’re interested in reflecting.
For Rosetta’s “singing comet” sonification, there was a very low frequency oscillation in the magnetic field surrounding the comet, measured by Rosetta. The decision here was to use the frequency of that vibration in the magnetic field as the pitch, but sped up by a factor of 10,000, so that it could register in the human ear. The volume was driven by how large the oscillations were, much as it would be for a sound wave on Earth.
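The speed-up works because playing a recording N times faster multiplies every frequency in it by N. As a rough sketch (the 45 mHz example value is of the order reported for 67P's magnetic oscillations, and the helper name is mine, not ESA's):

```python
# Rough hearing range of the human ear, in Hz.
AUDIBLE_LOW, AUDIBLE_HIGH = 20.0, 20_000.0

def playback_freq(measured_hz, speedup=10_000):
    """Playing a recording `speedup` times faster multiplies
    every frequency in it by that same factor."""
    return measured_hz * speedup

# A ~45 millihertz wiggle, far below hearing, lands at 450 Hz
# after a 10,000x speed-up - comfortably audible.
f = playback_freq(0.045)
audible = AUDIBLE_LOW <= f <= AUDIBLE_HIGH
```

The same trick runs in the other direction for sounds above our hearing range: slow the recording down and every frequency drops by the same factor.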
For the Sun, not only do we have to speed up the oscillation, but we also have to choose which observed oscillation we want to convert into a sound. The most common choice that I found was to examine the roiling surface of the Sun, which resembles nothing so much as a pot of water at a high boil. You could imagine examining how high the bubbles rise above the surface, and how quickly they do so. If we convert this amplitude and rapidity into a volume and a tone, we can get a musical note out for every bubble that rises to the surface. This is only one of many possible options for sonifying the Sun, but it seems to be one of the more common choices.
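Once each bubble has been reduced to a (pitch, duration, volume) triple, turning those triples into actual audio takes only the standard library. This is a bare-bones sketch, and the two example notes fed in at the bottom are invented placeholders, not real solar data:

```python
import math
import struct
import wave

RATE = 44100  # audio samples per second

def render(notes, path="sonified.wav"):
    """Write a mono 16-bit WAV file, one sine-wave tone per
    (pitch_hz, duration_s, volume_0_to_1) triple."""
    frames = bytearray()
    for pitch_hz, dur_s, vol in notes:
        for n in range(int(RATE * dur_s)):
            sample = vol * math.sin(2 * math.pi * pitch_hz * n / RATE)
            frames += struct.pack("<h", int(sample * 32767))
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(bytes(frames))

# Two made-up "bubbles": a taller, faster one (louder, higher)
# and a shorter, slower one (quieter, lower).
render([(440.0, 0.25, 0.8), (330.0, 0.25, 0.4)])
```

Playing the resulting file back is the audible equivalent of plotting the data: the ear picks out the rhythm and pitch changes that the eye would read off a chart.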