Volunteer With Video!

Practice makes perfect, right? The more we practice a skill, the more accomplished we become. This is certainly true of video production, which contains within it such a vast array of skills and challenges that every project seems to teach something new.

However, for photographers who are just beginning to work with video, it can be hard to get that practice. Before you approach your existing clients and offer them your services in the motion-picture domain, it’s helpful to have a few projects under your belt already. Unfortunately, few people will hire you to do something you haven’t already done, and it can be difficult to find friends willing to put in the time to be actors and crew for a first-time short filmmaker.

However, I have the answer: volunteer to do a promo video for a non-profit organization. Yes, that’s right: no profit for them, and no profit for you except the benefits of experience and practice. If you don’t have a favorite cause in your area, YouTube has a fantastic program that pairs charities with aspiring filmmakers. Here’s the link:


The beauty of this type of work is that it gives you a lot of creative autonomy, with a lot of client support. If you want to do a 30-second public service announcement or a 3-minute mini-documentary, you’ll probably be able to do just that, with as much help as you need. And, since you’ll be working with no budget, you’ll have to think on your feet, which is great experience. If you’re past the test-footage-of-your-dog stage of video, but not quite up to the pro level yet, this kind of practice is incredibly valuable.

Best of all, you’re helping other people while you help yourself. And that’s something we should all practice more!

Behind The Scenes – Piggly Wiggly TV & Print Ads

I recently had the pleasure of shooting both the Summer print and TV campaigns for Piggly Wiggly grocery stores with the same camera: the Canon 5D Mark II. “The Pig’s” stores are found throughout South Carolina and Georgia, and are a Southern institution. One was even featured in the Oscar-winning movie “Driving Miss Daisy.” It’s a great brand, and Piggly Wiggly is one of the largest clients of Rawle Murdy Associates, which is one of the largest advertising agencies in the area, and my biggest client, so it was a treat for me to double-dip this way.

Although the TV commercials and the print/outdoor/web ads all shared a common concept (a whimsical play on “Piggly Wiggly” that features folks enjoying “icely creamly,” “sweetly tealy” and so on), the requirements for the deliverables were very different, so they were two separate shoots.

The TV commercial takes place at a backyard barbecue, so the first challenge was location scouting. We had to find a house and yard that were upscale enough to be attractive, but not so ritzy that the target audience, which is primarily middle-class, wouldn’t relate to it. Once we found the perfect yard, I discovered a hidden bonus: a 60″ plasma TV in the garage that the family uses for tailgate events. The lady of the house was kind enough to let me use it for a client monitor, which, as you might imagine, was a huge hit with the client.

The only problem with using such a gigantic monitor was that we had to keep moving it so that I could see it from the camera position. Next time, I’m going to use an HDMI-to-composite converter, and then split the signal to my field monitor and to whatever client monitor I use.

Henry Mathieu, the agency’s associate creative director, was the writer/director for this spot, so I was the DP, not the director. But since it was my production company, I was able to assemble my favorite crew for this shoot. Len Spears is my right hand on shoots like this; not only is he a fantastic camera operator, gaffer and crane/dolly grip, he owns, can build, or can fix any type of equipment imaginable. Doug Hall is an experienced shooter in his own right, but his background in a New York rental house also makes him an outstanding key grip/swing man. Michael Fischbach is an excellent photographer who assists for me on still and video shoots, as well as helping to make sure that everything is going smoothly. The newest addition to my hit squad is Adrian Westendorff, who does a little of everything, but primarily helps me out as a grip and assistant editor. And, it’s not just a boys’ party; Ashley Perryman is my pick for the best hair/makeup stylist in the region, and Carlye Dougherty is the best – and most frugal! – wardrobe stylist around. Both of the ladies were on the crew list for this shoot, as well as Reina Davis, food stylist extraordinaire, and Bren Monteiro, who produced both projects for the agency, and whom I’ve had the pleasure of working with on several jobs.

Because we were going to be shooting outside, I made the decision to start shooting in the afternoon, to take advantage of the long Southern “golden hour.” Around here, sunrise lasts about 30 minutes, and the sun seems to stay directly overhead from 9 a.m. to 4 p.m. That overhead light is neither flattering nor easy to work with, so I wanted to shoot as much as possible, as late as possible.

While Henry and Bren positioned the extras around the background, I set up the camera on my HandySLR rig, and Len and the guys positioned large reflectors on stands around the yard. We had a couple of HMIs, which we used to fill in the darkest areas, but we tried to light as much as possible by bouncing sunlight where we wanted it. (Of course, the later it got, the faster the sun moved, so by the end of the shoot, Len and Doug were re-angling the reflectors after every take.)

We started off with an establishing crane shot. For this, we used a 12′ crane that Len built a few years ago. Low-tech, but rugged and perfect for this type of medium-budget shoot. Then I shot a few hand-held cutaways of the food table, followed by a couple of shots on “sticks” (tripod), and we finished up with an elevated dolly shot, again using a custom rig that Len built himself.

The weather smiled on us, the actors and crew were enthusiastic and upbeat, and the clients went home happy. It was a good day! Here’s the finished commercial: http://www.youtube.com/watch?v=dHP14dJiB5I

The photo shoot was much more contained. Basically, we just needed a large yard, so we used a different location, and my only crew members were my assistant, Michael Fischbach, and Reina Davis, the food stylist. The creative direction from the agency had been to go for a “hyper-real,” very poster-like look, so I lit the shots in a fairly stylized manner. I’m a huge fan of monolights, and shoot almost everything with no more than three Photogenic 1250s. In this case, I shot two monolights through a silk for the key light, and used the third, bare-bulb, as a backlight. The strong backlight gives these shots a very commercial, somewhat retro look, which fit the concept nicely. Here’s what one of the final photos looked like:

And here’s the billboard it appeared on:

Here’s another photo from the shoot, and the billboard it appeared on.


It’s always fun to work for an established brand, and I jump at any opportunity to work with my favorite team of pros. Plus, the fact that I shot regional print and TV with the same camera struck me as a nice example of how DSLR technology is changing the face of commercial production.

Audio Test: CX231 w/5D2 vs. Zoom H4N

Ever since Canon released the 2.0 firmware update for the 5D Mark II, I’ve been suspicious of the camera’s “manual” audio function. Using Magic Lantern http://magiclantern.wikia.com with the original firmware, I had been able to capture broadcast-quality audio (I’m speaking literally here: I used in-camera audio for, among other things, two regional TV spots that I shot and edited). However, on several shoots I’ve done with the new firmware, I’ve been noticing a hiss in the audio recorded in camera. On the shoots in question, I had the benefit of an audio operator running a top-of-the-line Sound Devices mixer/recorder. He recorded the sound on his hard drive, while feeding me the audio as a reference and/or work track. While I had planned to use the Sound Devices files for the final edit anyway, I was concerned that the poor audio quality I was hearing would be an issue for one-man-band shoots when I was flying solo.

Jon Fairhurst did some excellent tests comparing the CX231 to the Zoom (among other devices) while running Magic Lantern, but I haven’t seen any formal tests of the new firmware. So, I did a simple one.

I set up an Audio-Technica AT897 shotgun mic, and recorded some test dialogue in a couple of ways:

1) Through a JuicedLink CX231, into the 5D2.

2) Into a Zoom H4N recorder.

3) Looped through the Zoom H4N (through the headphone jack on the Zoom) and into the 5D2.

You can download the audio from here: http://5dfilmmaking.com/audio_test.aif

And see a spectrogram analysis here:

Based on this test, I drew a few conclusions:

1) The hiss I heard in my field work doesn’t seem to have been caused by the 5D2. I suspect the wireless feed I was receiving from the audio operator, but will need to run separate tests using the Sound Devices to verify that.

2) The Zoom and the 5D2 both generate a lot of faint noise (in this image, red is signal and blue is silence; the purple haze around the vocal signal is noise).

3) The Zoom clamps the signal at around 20 kHz (theoretically not an issue, since that’s the limit of “normal” human hearing), while the 5D2 does not.

4) The CX231 generates some sort of pilot tone close to 20 kHz. I can’t hear it, and I can’t explain it. I have a DN101 headphone amp/AGC disabler installed, but the tone was constant whether I had the DN101 on or off, and whether I took the output from the DN101 or directly from the CX231. I wrote to JuicedLink, and was pleasantly surprised to receive a very prompt and thorough reply: “The noise blip you see in the spectrograph around 20KHz is -50dB down, and is from the preamp switching power supply. There is no need to remove it in post. It’s not like the AGC disable tone which is injected at a high level to force the AGC lower and contains harmonic components. With an A-weighting filter (the human ear), it is attenuated by another 50dB. It is of no concern.”

5) The Zoom generates slightly less noise than the 5D2 (for this test, I had the CX231 set to maximum gain, and the internal gain in the 5D2 set to the 1/4 mark to get a nice hot signal), but looping the unbalanced headphone output of the Zoom into the 5D2 is the worst of both worlds. Clearly, this would only be a backup track (in case somebody forgot to press the “Record” button on the Zoom, which of course somebody would never do), or a nice solid track for PluralEyes to sync with.
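As an aside, you don’t need a full spectrogram to check a recording for a suspected tone like the one in point 4. A single-frequency Goertzel check will do. Here’s a minimal Python sketch; the 48 kHz sample rate and the synthetic 1 kHz test tone are just illustrative values, not my actual test files:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Relative power at one frequency (Goertzel algorithm).
    Handy for checking a mono recording for a suspected pilot tone."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Synthetic sanity check: a pure 1 kHz sine sampled at 48 kHz
rate = 48000
tone = [math.sin(2 * math.pi * 1000 * t / rate) for t in range(4800)]
print(goertzel_power(tone, rate, 1000) > goertzel_power(tone, rate, 20000))  # True
```

In practice, you’d read the samples from your own recording (the stdlib wave module will do for WAV files) and compare the power near 20 kHz against a quiet neighboring frequency.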

Overall, I was disappointed by how noisy the signals looked, but I was relieved to find that they all sounded pretty good. I’ll test the Sound Devices mixer to see if I can identify the source of the hiss, and update accordingly.

The 411 on 4:4:4 (and 4:2:0 and 4:2:2)

You often hear people referring to digital footage as being “4:4:4” or “4:2:0.” If you have no idea what they’re talking about, you’re not alone. In fact, many people who think they understand those terms actually don’t.

What we’re talking about here is called Chroma Subsampling, and there’s a LOT of confusion about this topic. Most of it stems from the fact that there have been two different approaches to chroma subsampling, and both of them are written out the same way: 4:x:x. However, as brevity is the soul of wit, I’m only going to cover the more modern and prevalent system.

Let’s start at the beginning. An electronic image is composed of little squares called pixels. Each pixel can have luminance – luma – which tells the pixel how bright or dark to be, and chrominance – chroma – which tells the pixel what color to be. If you don’t have any chroma data, your image will be grayscale – black and white. But if you don’t have any luma data, you won’t have any image at all.

Now, to have a reasonably good picture, every pixel needs to have its own luma data. But some clever engineers figured out a long time ago that every pixel does NOT need to have its own chroma data. You can save a lot of space by forcing chunks of pixels to share the same chroma sample – basically, to be the same color. And that process is called chroma subsampling.

Different flavors of chroma subsampling are written out using the format “J:a:b”

The first number “J” tells us how many pixels wide the reference block for our sampling pattern is going to be.

Chroma Subsampling - J:A:B

Sometimes it’s eight or three, but usually it’s four pixels wide.

The second number tells us how many pixels in the top row get chroma samples, and the third number tells us how many pixels in the bottom row get chroma samples.

4:4:4 Chroma Subsampling

As you can see here, if every pixel in the 4×2 grid gets a chroma sample, there’s actually no subsampling going on, and the scheme is 4:4:4. This is what’s used in high end HD cameras like the Panavision Genesis and Sony F35.

4:2:2 Chroma Subsampling

Now let’s take a look at 4:2:2. Every two pixels on the top row share a chroma sample, and every two pixels on the bottom row share a chroma sample. We’ve definitely lost a lot of detail, but we can still get an idea of the original image. This is the subsampling used in Panasonic cameras that record in DVCPRO HD, and Sony cameras that record in XDCAM HD422, as well as in editing codecs like Apple ProRes 422.

4:2:0 Chroma Subsampling

Now let’s take another step down and look at 4:2:0. Our “a” number is still 2, so every two pixels on the top row still share a chroma sample … But the “b” number is zero, which means that the pixels in the bottom row don’t get any chroma samples of their own. So, they have to share with whatever’s above them. You can see how much information is lost here. This is the subsampling used in DVCam, HDV, Apple Intermediate Codec, and most flavors of MPEG, including the ones generated by Canon DSLRs.
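To put rough numbers on the savings, here’s a quick Python sketch that simply counts raw samples per reference block (bit depth and compression are ignored, so this is back-of-the-envelope only):

```python
def data_ratio(j, a, b):
    """Data size of a J:a:b scheme relative to full 4:4:4 sampling,
    counted over one J-pixel-wide, two-row reference block."""
    luma = 2 * j              # every pixel keeps its own luma sample
    chroma = 2 * (a + b)      # each chroma sample carries two components (Cb, Cr)
    full = 2 * j * 3          # 4:4:4: every pixel has luma + Cb + Cr
    return (luma + chroma) / full

for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(scheme, round(data_ratio(*scheme), 3))  # 1.0, then 0.667, then 0.5
```

So 4:2:2 carries about two-thirds the data of 4:4:4, and 4:2:0 carries half, which is exactly why delivery formats lean on it so heavily.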

Looking at this diagram, you can see one of the main reasons why formats with heavy chroma subsampling give you blocky artifacts. What you’re seeing is actually chunks of pixels that are sharing chroma data and being forced to be the same color, to save space. And, of course, this isn’t even taking into account the other aspects of image compression, which can make this blockiness even worse.

This really becomes an issue when you talk about pulling a chromakey. Think about trying to pull the green pixels out of a shot of smoke, or wispy hair. It would be fairly easy if each pixel had its own chroma sample.

But it gets much harder when pixels are sharing samples, because the green pixels aren’t necessarily at the exact edge anymore. This is why you get those jagged lines around the edges of chromakeys with subsampled footage.

Of course, there are a lot of other factors that figure into the quality of an image, and chroma subsampling is only one of them. However, it’s one of the least straightforward to understand, so I hope this has been helpful.

The Importance of Audio

Photography and video have a lot in common. Image composition, lighting aesthetics and the shot-based production process apply to both fields. Of course, there are a number of technical differences – lighting instruments, camera support equipment and postproduction workflow to name a few – but these challenges generally aren’t too painful for the ambitious shutterbug. There is, however, one major aspect of video production that can be most painful indeed: audio.

Good audio won’t save a bad shot, but bad audio will definitely ruin a good shot. Noisy, echo-ridden, or distorted audio instantly labels a motion picture as an amateur effort. Regardless of the quality of the image, a project with bad audio will struggle to keep an audience’s attention.

I know what I’m talking about here, because I learned it the hard way. Indeed, it was one of the last lessons I learned in film school. While shooting my final project at the Savannah College of Art and Design – a 12-minute narrative, shot on 16mm film that was supposed to be the culmination of everything I’d learned over the last four years – I focused entirely on the visual components of the piece, learning – only too late – that the audio had been compromised.

After three months of writing, planning and preparation, my crew and I (all students) shot the project in three consecutive days, using the same equipment each day. At that time, the standard device for capturing dialogue was a DAT (Digital Audio Tape) recorder. Normally, DAT is pretty much bullet-proof, and the classmate who was acting as sound recordist had quite a bit of experience working with DAT equipment, so I had no reason to suspect that any issues were occurring. (In retrospect, this was a naive assumption on my part, and one that I have tried not to repeat too often. When it comes to audio, paranoia is often justified.)

While recording, the DAT sounded fine, but – as I found out after the shoot was over – something was going wrong with the tape itself. We never figured out exactly what had happened, but I remember thinking that my magnum opus sounded like it was being played back from a bad answering machine. The audio was terrible. There was no way to salvage it, I couldn’t afford to reshoot it, and I couldn’t use it. So, I had to bring in all my actors to “loop” their dialogue – to record their lines in a sound booth, and then piece everything together like a jigsaw puzzle, syncing the dialogue as well as I could.

In real movies, this process is called ADR, and the studios have the expertise and equipment to do it properly. I had neither. The process took forever, and when I was all done, my one chance to show Hollywood what a brilliant filmmaker I was, the project for which I had passed up the opportunity to be the best man in a good friend’s wedding because I felt that I couldn’t spare the time … Well, it looked like an overdubbed low-budget foreign film. The magic of the original performances was lost. The subtle sounds of cloth rustling and body movements had been awkwardly replaced by “foley” effects recorded in a sterile environment. The ambiance of the locations and the aural interaction of the characters was gone forever.

To say that it was a disappointment would be a drastic understatement. It was a crushing, devastating failure. And all because of audio.

This may seem like an extreme example, but I suspect that anyone who has worked with video for any length of time could share an experience as bad, or worse.

If you’re just getting into video, do yourself a favor and spend some time learning about audio. Get a handle on the difference between line and mic level signals. Read up on the different types of microphones, their pickup patterns, and their proper usage. If you’re shooting with a DSLR, get an external audio recorder, and get used to working with it. And, as I learned on that film project, check your playback as often as you can, to make sure that what you’re hearing is what you’re recording.

Working with audio is like swimming in the ocean. You don’t have to fear it, but you do have to respect it. And, just as you can charter a vessel in unfamiliar seas, you can hire a freelance audio engineer to work on complex or critical shoots. But what you can’t do is ignore it or muddle your way through it. Audio is a complicated subject, and any photographer starting to work with video needs to have at least a basic understanding of how to capture it properly. If you make an effort to learn what you’re doing, your projects will benefit. If you don’t, it’s only a matter of time before you’re telling your own story about a big shoot that got ruined by bad sound.

Lenses: Zooms vs. Primes

Without getting into the science of optics, there are a couple of pretty straightforward differences between zooms and primes.

The most basic difference, of course, is that a zoom lens can be adjusted to a range of focal lengths – you can zoom in and out – while a prime lens has a fixed focal length.

So, you may well ask, why would you ever want to use a prime lens?

Well, the next difference between zooms and primes helps us answer that question. All else being equal, a prime lens has fewer optical elements – fewer actual pieces of glass – inside of it than a zoom lens.

Typical Prime Lens Optics

Typical Zoom Lens Optics

Aside from making the lens smaller and lighter, which can be an important consideration under some circumstances, fewer elements means two things:

#1, the image is sharper. After all, looking through one window is clearer than looking through a stack of windows. Of course, a really good zoom lens may well be sharper than a really lousy prime lens, but generally speaking, the more pieces of glass you have, the more your image is going to get diffused.

#2, the lens is faster. Even clear glass absorbs some light. That’s why prime lenses almost always open up farther than zoom lenses. A fast zoom lens might open up to f/2.8, but a comparable prime lens can open up to f/1.4: a full two stops more.
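That “two stops” figure falls straight out of the arithmetic: the light a lens gathers scales with 1/f², so each full stop is a factor of √2 in f-number. Here’s a quick Python check, using the f/2.8 and f/1.4 figures from above as the example values:

```python
import math

def stops_between(f_slow, f_fast):
    """Full stops of light-gathering difference between two maximum apertures.
    Light scales with 1/f^2, so one stop = a factor of sqrt(2) in f-number."""
    return 2 * math.log2(f_slow / f_fast)

print(stops_between(2.8, 1.4))  # 2.0 -> the f/1.4 prime gathers 4x the light
```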

So. The advantages of zoom lenses are versatility and convenience. And the advantages of prime lenses are clarity and speed. Which you “should” use depends on what you’re doing.

Understanding “Film-Style” Shutter & SLR video

One of the most common issues that newcomers to SLR video struggle with is shutter speed. What shutter speed is most “cinematic,” and why?

To understand shutter speed for SLR video, it helps to understand how the shutter works on a traditional motion-picture film camera.

Film camera shutter open

film camera shutter closed

Inside a film camera, there is a rotary shutter, which is shaped like a semicircle. When the camera operates, the shutter mechanism turns continually; when it is “open” (letting light hit the film), an image is being exposed. When it is “closed” (blocking light from the film), the next frame is being moved into place.

Regardless of what speed the film itself is running through the camera, half the time the shutter is open, exposing the film, and half the time, the shutter is closed, advancing the film.

This is important, because it means that with a film camera, your shutter speed (how long each image is exposed to light) is half your frame interval. So, if you were shooting at the cinematic standard framerate of 24 frames per second, each frame would be exposed for half of 1/24 of a second: a shutter speed of 1/48.


Rotary Shutter at 45 degrees

Now, I’ve cheated a little bit by referring to the shutter as a semicircle. In reality, many motion-picture film camera shutters are adjustable. Imagine two semicircles pinned together: the amount of space left open could never be MORE than 180 degrees (half a circle), but it could be LESS, if you fanned the two semicircles out so that they left a smaller space exposed. That angle of exposed space is called the shutter angle. The standard shutter angle is 180 degrees – half a circle. But, for aesthetic or logistical reasons, sometimes angles of less than 180 are used, which means that the amount of time each frame is exposed to light is actually less than half the frame interval.

What this means is that, if you set the shutter speed on your DSLR to something faster than half your frame interval, you are simulating a smaller shutter angle. It’s not necessarily bad, but you need to be aware that it is an aesthetic decision.

So, bottom line: the traditional standard for narrative filmmaking is a framerate of 24 frames per second, with a 180 degree shutter, which means a shutter speed of 1/48. So, if you want to do narrative filmmaking with your DSLR, you want to stay as close to that as you can.

But, keep that “half your frame interval” shutter speed formula in mind. If you’re shooting at significantly higher framerates, and you want to stay faithful to the cinematic standard, you’ll need to adjust your shutter speed accordingly.
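The whole relationship boils down to one line of arithmetic: exposure time = (shutter angle / 360) / framerate. Here’s a small Python sketch of that formula:

```python
def shutter_time(fps, shutter_angle=180.0):
    """Exposure time per frame, in seconds, for a rotary shutter."""
    return (shutter_angle / 360.0) / fps

print(round(1 / shutter_time(24)))        # 48  -> the classic 1/48 second
print(round(1 / shutter_time(60)))        # 120 -> 1/120 at 60 fps
print(round(1 / shutter_time(24, 90.0)))  # 96  -> a crisper 90-degree shutter
```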

One caveat: If you’re shooting under artificial light, and you start noticing strange banding or strobing in your image, it’s probably because your lamps are cycling at the 60 Hz mains frequency, while your shutter is at 1/50. To compensate for this, set your shutter speed to 1/60.

With all that said, remember that rules are made to be broken. Absolutely terrific videos have been shot with no regard to “correct” shutter speed. Here’s a great example of footage shot with a shutter speed around 1/1000:

Whose blog is this?

My name is Alex Fox. I’m a commercial photographer, producer, director, cinematographer and editor. In other words, I’m fortunate enough to have very nice, smart, good-looking people hire me to make pictures for them.

I’ve done work for international brands like LeCreuset cookware, regional icons like the Southern grocery store chain Piggly Wiggly, small local businesses, and everything in between.

For years, I was frustrated with the image quality I could achieve with conventional video cameras. When I graduated from film school in 1999, DVCPro was the state-of-the-art format, and nobody had heard of High Definition.

When Sony released the Z1U, the first affordable HD camcorder (using the HDV format), I jumped on it. The quality was indeed much higher than anything I’d worked with before, but it was still far from 35mm or even 16mm film. Eventually, I upgraded to the full-size Sony S270U, and started using 35mm lens adapters, so that I could get a more cinematic look to my video work. The adapters worked, but the technology was cumbersome, and the quality somewhat unsatisfying.

At the same time that I was working on video projects, I was doing commercial photography work. My camera of choice was the Canon 5D, which captured superb images that – in my mind, certainly – were as good or better than 35mm film. Then, in late 2008, I heard that Canon was releasing a Canon 5D Mark II which was going to have a more rugged body, a higher megapixel image capture … And video capability. I wasn’t sure what this would mean, until I saw Vincent LaForet’s now-legendary short film, “Reverie.”

I immediately realized the implications of this technology. The price-point obstacles that had separated small-time, one-man-band operations like mine from the high-dollar production companies had just gone up in smoke. For the first time in the history of technology, a viewer could not tell whether footage had been shot on a $200,000 camera or a $2,000 camera.

In 2008/2009, my business had been hit very hard by the recession. I couldn’t afford to buy the “5D2,” but I couldn’t afford not to be an early adopter of this incredible new technology. Finally, my wife put an end to my suffering by buying the camera, and putting the purchase on her personal credit card.

Once I started working with the camera, I spent a fair bit of time online, at forums like Cinema5D.com. I noticed that people were asking the same questions over and over. What shutter speed should I use? How do I get decent audio? What kind of tripod should I buy? I also noticed that there were certain questions that 5D2 users were NOT asking. Questions about how to plan a shoot, construct a shot list, or format a script. Aspiring filmmakers were jumping into the deep end without any clue what they were doing, and they were wasting incredible amounts of time.

So, I decided to take my film school degree and my 10+ years of experience and pour them into a crash-course training video that would give beginners all the information they needed to hit the ground running, regardless of the type of projects they were pursuing. Music videos, documentaries, short films, commercials … The creative direction might be different, but the fundamentals of production are largely the same. The fruit of these labors was 5DFilmSchool.

My goal had been to be the first person to release a comprehensive training course. As it turned out, I was the second. Philip Bloom, who had been an even earlier adopter than me, beat me to the punch by about two weeks. Still, 5DFilmSchool got great reviews, and I sold enough copies of it to compensate me for the time and sweat I had put into it. (And I mean sweat … I recorded the program in my studio during a South Carolina summer, and I had to keep the air conditioning off to get clean sound. It was not fun!)

The most rewarding part of the 5DFilmSchool project came when I started receiving emails from satisfied customers who shared with me the videos that they had made using the techniques they had learned from my videos. A guy who did a promo video for the Orange County Sheriff’s Department. A Hawaiian musician who created a fantastic music video with his wife. Students and photographers who were proud of the work they had done, and who thanked me for helping them. It was the most gratifying experience of my professional life.

Three years have gone by since I started working on 5DFilmSchool. In that time, the whole landscape has changed. The 5D2 is no longer the only kid on the block, as a veritable pantheon of HD-capable DSLR (Digital Single Lens Reflex) and EVIL (Electronic Viewfinder, Interchangeable Lens) cameras have flooded the market. Certain issues – the lack of manual controls and industry-standard framerates – have been resolved, and others – issues with audio, moiré and aliasing – have become more apparent.

Because microbudget filmmaking encompasses so much more than SLR cinematography, I replaced 5DFilmSchool with an eBook modestly entitled “Make Movies Without Money”. As the rate of change continues to accelerate, I’ve found that blogging is the best way to share my ongoing observations about techniques, principles, and equipment. I hope you find my posts useful and entertaining.