How do I check for phase?

Recording Techniques, People Skills, Gear, Recording Spaces, Computers, and DIY

Moderators: drumsound, tomb

User avatar
Silverlode
gettin' sounds
Posts: 118
Joined: Sat May 15, 2004 12:08 pm
Location: Chicago
Contact:

How do I check for phase?

Post by Silverlode » Mon Dec 12, 2005 11:06 am

Phase this phase that. I know it's important to have drum mics in phase, for example, I'm just wondering how the heck I can "check" for this. I don't have a mixing board. Everything basically goes straight into Logic Pro through my I/O. What's the best way? I don't even know what to listen for.

School me. Thanks! :P
y.t. > Silverlode
http://thebside.org

User avatar
JohnDavisNYC
ghost haunting audio students
Posts: 3035
Joined: Fri Oct 03, 2003 2:43 pm
Location: crooklyn, ny
Contact:

Post by JohnDavisNYC » Mon Dec 12, 2005 11:19 am

a good way to learn to hear phase relationships:

1: set up a guitar amp.

2: put 2 57s (or any 2 of the same mic) right next to each other, at exactly the same distance from the speaker.

3: record this sound.

4: listen to what happens when you have them panned up the center and flip the phase (polarity) on one channel with a 'helper' plugin like gainer or something in logic.

5: move one of the 57s back 2 inches from the speaker and repeat steps 3 and 4.

you should hear weird cancellations between the mics as different frequencies arrive at the 2 mics at different times. experiment with different mics, positions, etc....
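
To put numbers on step 5 (a rough sketch, assuming NumPy; the 343 m/s speed of sound and the test frequencies are just nominal picks), summing one mic with a copy delayed by 2 inches of air travel cancels some frequencies and reinforces others:

```python
# One "mic" hears the amp directly, the other hears the same signal
# delayed by ~2 inches of extra air travel. Sum them and compare levels.
import numpy as np

fs = 48000                     # sample rate (Hz)
c = 343.0                      # rough speed of sound in air (m/s)
offset_m = 2 * 0.0254          # 2 inches in metres
delay_s = offset_m / c         # ~148 microseconds of extra travel time

t = np.arange(int(fs * 0.1)) / fs   # 100 ms of test tone

for f in (500, 1000, 2000, 3400, 5000):
    mic_a = np.sin(2 * np.pi * f * t)
    mic_b = np.sin(2 * np.pi * f * (t - delay_s))   # same tone, arriving late
    summed = mic_a + mic_b                          # what you hear panned centre
    gain_db = 20 * np.log10(np.max(np.abs(summed)) / np.max(np.abs(mic_a)))
    print(f"{f:5d} Hz: sum is {gain_db:+5.1f} dB relative to one mic alone")
```

Around 3.4 kHz the two arrivals land half a cycle apart and nearly cancel; move the mic and the notch moves with it.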


for drums, you can do something similar with overheads. change the height of the mics and stuff and listen to them with the snare mic and see how the timbre changes with height and polarity flipping.


there is no concrete 'rule' but this should give you an idea about how phase relationships affect the sound.

copy a track in logic and use the sample delay on one, and you can hear the comb filtering from varying degrees of 'out of phase-ness'.
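
A quick numerical sketch of that sample-delay experiment (assumes NumPy; the 44.1k rate and 20-sample delay are arbitrary examples): summing a track with a delayed copy gives the comb-filter response 2|cos(pi*f*D/fs)|, with dead notches at odd multiples of fs/(2D):

```python
# Comb filtering from summing a track with a copy delayed by D samples.
import numpy as np

fs = 44100          # session rate, just an example
D = 20              # delay of the copied track, in samples (~0.45 ms)

notches = (np.arange(5) + 0.5) * fs / D   # frequencies that cancel completely
print("first notch frequencies (Hz):", np.round(notches).astype(int))

for f in (100, 500, 1000, 1102, 2205, 4410):
    mag = 2 * abs(np.cos(np.pi * f * D / fs))
    print(f"{f:5d} Hz: {20 * np.log10(mag + 1e-12):+6.1f} dB")
```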

phase is not an absolute, it is about the relationships between a source and multiple mics... sometimes a weird phase relationship between a room mic and an overhead can be just what you needed to get the 300hz honk out of a snare drum.

hope that helps a bit.

cheers,
john
i like to make music with music and stuff and things.

http://www.thebunkerstudio.com/

User avatar
vvv
zen recordist
Posts: 10164
Joined: Tue May 13, 2003 8:08 am
Location: Chi
Contact:

Post by vvv » Mon Dec 12, 2005 11:29 am

"Out of phase often sounds "thin"..

I do not know about Logic, but Cool Edit and other programs have the ability to check phase within the program.
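
For what it's worth, the "phase" (correlation) meters those programs show boil down to something like the normalized correlation of two channels. A minimal sketch, assuming NumPy, where +1 means the channels move together and -1 means one is inverted:

```python
# Normalized correlation of two channels over a window, as a rough
# stand-in for what a DAW correlation/phase meter displays.
import numpy as np

def correlation(a, b):
    a = a - np.mean(a)
    b = b - np.mean(b)
    return float(np.sum(a * b) / (np.sqrt(np.sum(a * a) * np.sum(b * b)) + 1e-12))

fs = 48000
t = np.arange(fs) / fs
mic = np.sin(2 * np.pi * 200 * t)

print(correlation(mic, mic))                                      # ~ +1.0: in phase
print(correlation(mic, -mic))                                     # ~ -1.0: polarity flipped
print(correlation(mic, np.sin(2 * np.pi * 200 * t - np.pi / 2)))  # ~ 0.0: 90 degrees apart
```
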
bandcamp;
blog.
I mix with olive juice.

User avatar
Meriphew
deaf.
Posts: 1759
Joined: Sun May 11, 2003 9:56 am
Location: Seattle USA

Post by Meriphew » Mon Dec 12, 2005 12:38 pm

Click to left = phase. Click to right = kill.


mn412
gimme a little kick & snare
Posts: 77
Joined: Sun Jun 15, 2003 1:53 pm

Post by mn412 » Mon Dec 12, 2005 12:38 pm

If you record a section of the drums and zoom in on the waveforms you can check and see if everything is in phase. I guess some would consider it cheating but it's a good way to double check.
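
If you do want to line things up by number rather than by eye, cross-correlating two tracks gives the same answer zooming in approximates: the arrival-time offset in samples. A small sketch, assuming NumPy, with a fake snare hit and a made-up 43-sample offset:

```python
# Estimate how many samples later a hit arrives in a distant mic
# by cross-correlating it with the close mic.
import numpy as np

fs = 48000
rng = np.random.default_rng(0)

# Fake "snare hit": a burst of noise with a fast decay.
hit = rng.standard_normal(2048) * np.exp(-np.arange(2048) / 300.0)

close_mic = np.zeros(4096)
overhead = np.zeros(4096)
close_mic[1000:1000 + hit.size] += hit
overhead[1043:1043 + hit.size] += 0.5 * hit   # arrives 43 samples (~0.9 ms) later

corr = np.correlate(overhead, close_mic, mode="full")
lag = int(np.argmax(corr)) - (close_mic.size - 1)   # positive = overhead is late
print(f"estimated offset: {lag} samples (~{1000.0 * lag / fs:.2f} ms)")
```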

drumsound
zen recordist
Posts: 7484
Joined: Tue Jun 01, 2004 10:30 pm
Location: Bloomington IL
Contact:

Post by drumsound » Mon Dec 12, 2005 1:06 pm

When I'm getting drum sounds I pan all the returns to the left. I then listen to different mics and hit the polarity button to see if the sound changes and if it's better or worse. If two mics (esp OH and room) are in good 'phase relation' hitting the polarity reverse will make the source sound really thin. It's easiest to hear with a coincident pair.
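
That listening test can also be written as arithmetic: sum the two mics normally and with one flipped, and compare levels. A hedged sketch assuming NumPy, with a made-up source and a roughly coincident second mic:

```python
# Compare the level of two mics summed normally vs. with one polarity-flipped.
# If the mics agree, the flipped sum comes out much quieter/thinner.
import numpy as np

def rms_db(x):
    return 20 * np.log10(np.sqrt(np.mean(x * x)) + 1e-12)

fs = 48000
t = np.arange(fs // 4) / fs
source = np.sin(2 * np.pi * 150 * t) + 0.3 * np.sin(2 * np.pi * 450 * t)

overhead = source
room = 0.7 * np.roll(source, 5)     # ~0.1 ms later and a bit quieter (crude delay)

print("normal sum :", round(rms_db(overhead + room), 1), "dB")
print("one flipped:", round(rms_db(overhead - room), 1), "dB")   # noticeably thinner
```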

joel hamilton
zen recordist
Posts: 8876
Joined: Mon May 19, 2003 12:10 pm
Location: NYC/Brooklyn
Contact:

Post by joel hamilton » Mon Dec 12, 2005 1:10 pm

mn412 wrote:If you record a section of the drums and zoom in on the waveforms you can check and see if everything is in phase. I guess some would consider it cheating but it's a good way to double check.
This is misleading. You can't "see" the duty cycle of all the frequencies. I guess this could work for arrival times, but not for phase. The duty cycle of 10k... that is 10,000 times per second... That doesn't "show up" as anything useful in PT anyway.

Someone once said on this very messageboard a few years back: "learn to be allergic to phase." That has stuck in my brain. Learn to "feel" it even if something is slightly "twisted" rather than just 180 out. You can use phase relationships to color the overall recording. More "amateur" recordings are ruined by bad phase relationships ("bad" meaning unflattering) than by mic selection, gear, performances... anything... combined.

Try what john (toaster 3000) suggested, and then learn to apply that to drums, or any multi-mic'd source. You will start to need less EQ, you will hear the original gesture remain intact throughout the process of mixing... More clarity, more "oomph" all around. The more I record, the more I realize that getting your phase relationships to work for YOU is really one of the most important things to making a great recording. Learn this and all the gear in the world becomes gravy. A perfectly phase coherent recording using mackie mic pre's will blow away a smeary, phasey recording made with 1073's any time. Really.

User avatar
Doublehelix
takin' a dinner break
Posts: 172
Joined: Sat Apr 03, 2004 5:59 pm
Location: USA
Contact:

Post by Doublehelix » Mon Dec 12, 2005 1:55 pm

You make some great points Joel, but this is often a very difficult thing to do when you have 15 mics going at once and you're trying to keep track of all the phase relationships between them.

As mentioned, 180° phase problems are pretty easy to hear, but it is all of those in-between phase issues that cause the most problems; they are sometimes difficult to detect and even harder to correct, especially with so many possible combinations and permutations of mic placements.

Something like the Little Labs IBP can be a real lifesaver, but again, with a lot of mics, it is certainly not an easy thing to do.

And just to add to Joel's comments again on zooming in on the waveform...

When you zoom in and match the waveforms like that, you are time-aligning, but not (necessarily) phase-aligning. Phase is *very* frequency dependent, as Joel points out. Read the IBP user's manual online at the Little Labs site for some more info on phase, and what exactly an IBP does inside the box... it is a lot more than just delaying the signal.
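
For the curious, the family of tool the IBP belongs to is usually described as an all-pass phase adjuster rather than a plain delay. This is not Little Labs' circuit, just a generic first-order all-pass sketch (assumes NumPy) showing how the level stays put while phase shifts by a frequency-dependent amount:

```python
# First-order all-pass filter: unity gain at every frequency, phase sweeps
# from 0 degrees at DC towards -180 degrees at Nyquist (-90 deg at f_break).
import numpy as np

def allpass_first_order(x, fs, f_break):
    # H(z) = (c + z^-1) / (1 + c*z^-1)
    c = (np.tan(np.pi * f_break / fs) - 1.0) / (np.tan(np.pi * f_break / fs) + 1.0)
    y = np.zeros_like(x)
    x_prev = y_prev = 0.0
    for n in range(len(x)):
        y[n] = c * x[n] + x_prev - c * y_prev
        x_prev, y_prev = x[n], y[n]
    return y

fs = 48000
t = np.arange(fs // 10) / fs
for freq in (100, 1000, 8000):
    x = np.sin(2 * np.pi * freq * t)
    y = allpass_first_order(x, fs, f_break=1000)
    # Same RMS level in and out, only the phase has been rotated.
    print(freq, "Hz  level in:", round(float(np.sqrt(np.mean(x * x))), 3),
          " level out:", round(float(np.sqrt(np.mean(y * y))), 3))
```
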
DH

"Nobody goes there anymore; it's too crowded."
-Yogi Berra

joel hamilton
zen recordist
Posts: 8876
Joined: Mon May 19, 2003 12:10 pm
Location: NYC/Brooklyn
Contact:

Post by joel hamilton » Mon Dec 12, 2005 2:06 pm

Doublehelix wrote:You make some great points Joel, but this is often a very difficult thing to do when you have 15 mics going at once and you're trying to keep track of all the phase relationships between them.

But that goes with the territory. 15 mics with poor phase relationships are not additive. I would rather have 1 mic on the drums than 15 with shitty phase relationships for sure. Maybe two... one for the kick... ;)

User avatar
Silverlode
gettin' sounds
Posts: 118
Joined: Sat May 15, 2004 12:08 pm
Location: Chicago
Contact:

Post by Silverlode » Mon Dec 12, 2005 2:40 pm

Thanks everyone...I'm going to take myself to phase school and try some of these things out. Great discussion.
y.t. > Silverlode
http://thebside.org

Professor
ghost haunting audio students
Posts: 3307
Joined: Wed May 07, 2003 2:11 pm
Location: I have arrived... but where the hell am I?

Post by Professor » Mon Dec 12, 2005 5:49 pm

Before you take yourself off to phase school, let me see if I can back this up just a little further. There's lots of great info above about recognizing and dealing with phase, but I think it may be worth actually explaining what the hell it is.

Contrary to how "sound" is displayed inside your computer, actual sound waves do not travel as cute little 2-dimensional waves radiating from a source but rather as a 3-dimensional pressure wave. The pressure of the air around the source is increased and decreased in a repeating pattern. As the pressure changes it can be measured at a point in space. The most convenient tool for measuring the repetitive pressure changes of air in real time is a microphone which transduces (changes one form of energy into another) the acoustic pressure waves into an alternating current voltage. Once that change has been made and the sound has been converted into a voltage we can do all the crazy things we like to do with that "signal" including converting the voltage into computer data and representing the increase and decrease in pressure with that familiar little "sound wave" graph you see in your computer.

OK, so what about this "phase" stuff?
Well, as I said, the microphone measures the changes and transduces them into an alternating current voltage. When the pressure is increasing, there is quite literally, air pushing the diaphragm of the microphone inward, and the microphone is designed to convert this into a positive voltage. And naturally, when the pressure is decreasing, the air is sucked away from the diaphragm, and the microphone converts this to a negative voltage. When we display this on your computer screen, you see a graph that swings up and down, positive and negative, over and over again, visually representing the changes in air pressure created by the sound 'wave'.
On the other end of your system, your speakers receive an alternating current wave and turn it back into pressure waves in air. Quite literally, they push out for positive voltages and pull back for negative voltages. In an ideal situation, the pressure waves created by the live source are converted, saved, and then later reproduced with the same push and pull, increase and decrease, "compression and rarefaction" of the air so that you hear an accurate replica of the original acoustic event. There are many arguments about whether we can hear the difference in "absolute phase" which is the relation of a single mic to speaker and whether a wave is indeed pushing or pulling, because by nature they do both - very fast. It's easy to make the argument against hearing absolute phase on a sustained high note from say a flute, but also equally easy to argue in favor of hearing absolute phase when hearing a short, percussive, low frequency sound like a bass drum hit. But that's a discussion of a different sort.

Where it gets tricky is that we are never really recording just one single frequency at one single point in space. Rather, we are recording complex combinations of frequencies which push and pull with and against each other. That's why the 'waves' you see represented on screen never look as neat and tidy as the textbook sine, saw, triangle or square waves.
But that still doesn't account for the phasing issues we are always discussing, because those arise from capturing sounds through multiple microphones.

Sometimes we catch a single sound in multiple mics by accident (which we call bleed) and sometimes we do it on purpose, for example on a drumset. We can pretend all we want that the snare drum mic will only hear the snare drum, but it hears everything, just like all the mics do. And we run into this awkward difference in speeds - the difference between the speed of sound at about 340-350 meters/second vs. the speed of light at about 300,000,000 meters/second.
You see, when the sound is travelling through the air, it is travelling at, well, the speed of sound in air. But when the sound reaches a microphone and is converted into a voltage, then it starts travelling at the speed of light where that voltage is concerned - though back outside in the "real world" that sound is still travelling at the speed of sound, through more air, to arrive at the next microphone.
For all intents and purposes, we tend to consider the electrical part "instantaneous" but that acoustical speed of sound in air is slow enough that we count it as roughly 1ms per foot. It doesn't seem like much, but 1ms is amazingly slow in the world of physics, and in frequency terms, it is 1 full cycle (up and down and back to zero) of a 1,000 Hz tone, or 1/2 of a cycle (up and back to zero) of a 500 Hz wave. That means that if you have two microphones about 1-foot apart, and a 500 Hz sound is generated, then the pressure wave will be increasing in one microphone while it is decreasing in the other. A 1000 Hz sound will have the mics working together again, though they will be 1 full cycle off.
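
That arithmetic is easy to check (plain Python; the exact 343 m/s figure makes a foot come out to about 0.89 ms rather than a round 1 ms, so the cycle counts land a little shy of the rule-of-thumb values):

```python
# Arrival-time difference and phase offset for two mics a given distance apart.
speed_of_sound = 343.0   # m/s; roughly the "1 ms per foot" rule of thumb

def arrival_difference(spacing_m, freq_hz):
    delay_s = spacing_m / speed_of_sound
    return delay_s, delay_s * freq_hz      # seconds of delay, cycles of offset

for freq in (250, 500, 1000, 2000):
    delay, cycles = arrival_difference(0.3048, freq)   # two mics one foot apart
    print(f"{freq:4d} Hz: {delay * 1000:.2f} ms late = {cycles:.2f} cycles of offset")
```

At 500 Hz the offset is roughly half a cycle (push in one mic, pull in the other), and at 1 kHz it is roughly a full cycle again.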
So that's where the problem starts to rear its ugly head. You might have a sound that is pushing in one microphone while pulling in another. The trouble is that once those sounds become voltages, then everything is travelling at light speed (literally) and when you go to combine those signals together, one voltage moving positive added to another voltage moving negative equals ZERO.
And that ain't all! Let's say your snare drum is tuned to 500Hz, and also has some overtones at 1kHz, and at 1500, and 2k, etc. Well the 500Hz signal would be hitting out of "phase", while the 1kHz is "in phase", and the other overtones fall at varying degrees in between. That means that some frequencies will be cancelled while others are boosted by the two microphones working together.

That's the basis of the problem - an issue of timing differences between the sound arriving at one microphone before arriving at another.
Conveniently, that is also the source of the best cure for the problem, which is to "time align" the microphones. We know the sound will arrive last at the microphone which is furthest away, and so we can "delay" the sounds for the closer microphones so they line up with the last mic. Once we account for the acoustical delay caused by the speed of sound, many of the phase issues are fixed - not necessarily all of them, but most of them.
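
A minimal sketch of that time-align fix, assuming NumPy; the distances are invented, and in practice you would nudge the track in the DAW rather than pad arrays by hand:

```python
# Delay the closer mic by the acoustic travel-time difference so both
# mics agree before you sum them.
import numpy as np

fs = 48000
speed_of_sound = 343.0
snare_to_close = 0.05     # metres (invented)
snare_to_overhead = 1.10  # metres (invented)

extra = (snare_to_overhead - snare_to_close) / speed_of_sound
shift = int(round(extra * fs))
print(f"overhead arrives {shift} samples (~{extra * 1000:.2f} ms) after the close mic")

def delay_track(track, samples):
    # Pad the front instead of np.roll so nothing wraps around from the end.
    return np.concatenate([np.zeros(samples), track])[:len(track)]

t = np.arange(fs // 4) / fs
source = np.sin(2 * np.pi * 180 * t)          # stand-in for the snare
close_mic = source
overhead = 0.6 * delay_track(source, shift)   # same hit, later and quieter

raw_sum = close_mic + overhead
aligned_sum = delay_track(close_mic, shift) + overhead   # close mic pushed back to match

print("unaligned sum RMS:", round(float(np.sqrt(np.mean(raw_sum ** 2))), 3))
print("aligned sum RMS  :", round(float(np.sqrt(np.mean(aligned_sum ** 2))), 3))
```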

Hopefully that puts all the other advice up there into a bit more perspective.

-Jeremy

bniesz
takin' a dinner break
Posts: 188
Joined: Fri Oct 17, 2003 10:12 am
Location: Cincinnati, OH
Contact:

Post by bniesz » Mon Dec 12, 2005 10:34 pm

joel hamilton wrote:Someone once said on this very messageboard a few years back: "learn to be allergic to phase." That has stuck in my brain. Learn to "feel" it even if something is slightly "twisted" rather than just 180 out.
That reminds me of a story:
I hooked up the monitors in the studio for some tunes during setup or cleanup once. Went into the control room to cue up some iTunes, and the second I walked back into the studio I started to freak out and get kinda dizzy. I ran to the monitors and flipped the phase on one of 'em, then sighed in relief.

User avatar
wedge
tinnitus
Posts: 1088
Joined: Tue Jul 06, 2004 9:08 pm
Location: Washington, D.C.

Post by wedge » Mon Dec 12, 2005 10:50 pm

joel hamilton wrote:A perfectly phase coherent recording using mackie mic pre's will blow away a smeary, phasey recording made with 1073's any time. really.
I had no idea that it was that important. I always thought it was an issue mostly with bass drums. I'm beginning to think that my band's mixes are replete with phasing issues, due to total ignorance of it during the recording process. The mixing is mostly done. Is it possible to address phasing problems at the stereo-mix level? Mastering engineers can deal with phase issues, right?

Professor
ghost haunting audio students
Posts: 3307
Joined: Wed May 07, 2003 2:11 pm
Location: I have arrived... but where the hell am I?

Post by Professor » Mon Dec 12, 2005 11:18 pm

No, there ain't much a mastering engineer can do to repair phase problems built into the mix - at least not as easily as you can fix them in mixing. If you are still working on the mixing and can get back into each song and work out delay timings and such, then you stand a better chance. But if your mixes sound alright and you don't lose big chunks of the sound in mono, then you're probably going to be alright.

-Jeremy

User avatar
Fletcher
steve albini likes it
Posts: 395
Joined: Fri Jun 06, 2003 7:38 am
Location: München
Contact:

Post by Fletcher » Tue Dec 13, 2005 9:33 am

mn412 wrote:If you record a section of the drums and zoom in on the waveforms you can check and see if everything is in phase. I guess some would consider it cheating but it's a good way to double check.
I wouldn't consider it cheating... but I would consider it exceptionally dumbass. Last I checked we l-i-s-t-e-n to music, we don't watch music. The minor timing differences between tracks will often be the thing that gives a recording "depth" and "dimension"; if you eliminate these timing differences you will more often than not come out with something exceptionally flat, boring, and one- or two-dimensional in terms of sonic and emotional impact.

It would be a far superior idea to check your audio in mono through one speaker, as has been suggested several times in this thread, than to look at a got damn thing.

But who knows... YMMV...
