Teacher of the Year Laura | RI Studio Headshots

Laura is a good friend of mine who works as an art teacher in North Providence. She just happened to win teacher of the year at her school and is now prepping to enter the citywide contest. From there she'll go to regionals, then nationals, then wear the teacher of the year crown until next year.

 

Part of her application packet was a headshot requirement. This is where I come in. She said a black background was fine, so Saturday afternoon I started setting up some lighting in the basement with a black backdrop. I then proceeded to annoy Tara by making her sit there while I adjusted everything. She lost interest after a while, and I resorted to the self-timer and running into the scene to try a self-portrait. She then came back and helped me out a little more.

 

When Laura showed up, the whole setup was ready to go. The photo session took literally four minutes. We dumped the photos to my computer, picked the one she liked most, and then we sat together while I edited. It was kind of fun to edit while the person was sitting right there. Her shirt was actually hot pink in real life, but we changed it to blue because we both thought it looked better. Scroll down for some behind the scenes info.

 

Rhode Island Studio Headshots - Laura

 

A little behind the scenes for you photographer nerds out there (and you strobists) – All the lighting was done with speedlights. Below you can see the setup. #1 is an SB600 on a boom with a shoot through umbrella, #2 is a bare optically triggered YN560, #3 is a junky bare Sunpak, #4 is a bare SB600 behind Laura’s head pointing back at the camera, and #5 is a piece of white foam board carefully balanced on our laundry basket and our deep freezer basket to fill in some shadows from below. As for the rest of the stuff in the picture, well, I’d like to see how clean your basement is. Please notice the professional seating device – an old CRT TV. I would have liked to have her stand, but the ceilings are too low to set the lights up for a standing adult. Three lights were triggered with wireless triggers and #2 was optically slaved (I ran out of triggers).

 

As you can see, we don’t have a real studio space, but this should serve as an example of how you can make do with what you’ve got to get pretty good results. Speedlights used correctly can look like you’re using much larger studio strobes. Experiment!

 

Rhode Island Studio Headshots - Laura

 

Rhode Island Studio Headshots - Laura

 

Rhode Island Studio Headshots - Laura

 

Rhode Island Studio Headshots - Laura

 

A How-To Guide: Photo Restoration, Colorization, and Retouching

I thought we might take a step back from weddings today so I could share a few little things that may be useful. I recently had someone inquire about photo restoration services. He had a couple of childhood photos from years ago that he wanted to get printed, but one was in pretty tough shape and the other was black and white. He wanted to restore the color photo and colorize the black and white one.

 

This is not something I normally do, but I always love to figure out and learn new techniques in Photoshop so I told him I’d give it a go. The restoration of the color photo wasn’t too difficult (though it is time consuming) since it employs a lot of the same techniques as retouching a fashion photo would. The colorization was something I had never done before.

 

Let's start with the restoration. Below you can see a before and after. I'll just run through a high-level description of the process, since there are already millions of Photoshop tutorials out there on how to use the specific tools involved.

 

Photo Restoration

The first step, if the image is intended for print, is to crop to the proper aspect ratio. In this case, the intended print size was 8×10. Next, the background was removed by masking it off and deleting it. A layer was created below the main image and a black and white radial gradient was placed on it. Any background you want could be used in this step. After that, a new layer was created on top of the stack and the heavy lifting started. All of the deterioration of the scanned photo needed to be removed – the rips, crinkles, folds, stains, etc. This is easier in some places than it is in others. A lot of time and patience pays off here as you move through the photo with the spot removal tool and fix each blemish one by one.
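If you'd rather script the first couple of those steps than click through them, here's a rough sketch in Python with Pillow and numpy – just the crop-to-ratio and radial gradient pieces (the file names are made up, and the spot-by-spot cleanup really does have to be done by hand):

```python
# rough sketch: center-crop a scan to an 8x10 aspect ratio, then build a
# black-and-white radial gradient that could sit behind the cut-out subject
# (file names are placeholders)
import numpy as np
from PIL import Image

scan = Image.open("old_scan.jpg")
w, h = scan.size
target = 8 / 10                               # width / height for a portrait 8x10
if w / h > target:                            # too wide: trim the sides
    new_w = int(h * target)
    box = ((w - new_w) // 2, 0, (w + new_w) // 2, h)
else:                                         # too tall: trim top and bottom
    new_h = int(w / target)
    box = (0, (h - new_h) // 2, w, (h + new_h) // 2)
cropped = scan.crop(box)
cropped.save("cropped_8x10.jpg")

# radial gradient: bright in the middle, falling off to black at the corners
cw, ch = cropped.size
yy, xx = np.mgrid[0:ch, 0:cw]
dist = np.hypot(xx - cw / 2, yy - ch / 2)
gradient = (255 * (1 - dist / dist.max())).astype(np.uint8)
Image.fromarray(gradient, mode="L").save("background_gradient.png")
```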

 

Once that was finished, it was time to color correct the image, as the original had extremely strong red hues. I added a layer of blue, changed the layer style, and adjusted the opacity to balance the red. I also added a hue/saturation layer, a contrast layer, and a levels adjustment to fix some of the toning. That was about the extent of it. Here's the after and before (yes, they're backwards):

 

Blueflash Photography Wedding

 

Photo Colorization

I had never colorized a black and white photo before, so I wasn't sure what to expect. I started with the same steps as the photo restoration – cropping for the proper aspect ratio, removing the background, and adding the new gradient background to match the previous photo. I then started making masks for each individual piece of clothing, along with grouping all the skin areas of the photo. This made it easy to apply different colors to different parts of the photo. Once all the masks were created, I was able to simply make a hue/saturation layer for each part, apply the appropriate mask to that layer, set the 'colorize' option, and adjust as necessary. It was actually much easier than I anticipated. The hardest part was deciding what colors to make the clothes. I had to consult Tara for that. Below is a screenshot of the Photoshop layers and below that is a before and after.

 

Blueflash Photography Wedding

 

Blueflash Photography Wedding
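For the scripting crowd, here's roughly what that hue/saturation 'colorize' step is doing under the hood – a minimal numpy sketch that keeps the original brightness and forces a chosen hue and saturation inside a mask. The file names, the mask, and the particular blue are all made up for illustration (the mask image needs to be the same size as the photo):

```python
# minimal sketch of the "colorize" idea: keep the gray photo's brightness and
# force a chosen hue/saturation inside a mask (file names and tint are made up)
import colorsys
import numpy as np
from PIL import Image

gray = np.asarray(Image.open("bw_photo.jpg").convert("L"), dtype=np.float64) / 255.0
mask = np.asarray(Image.open("shirt_mask.png").convert("L")) > 128   # white = shirt

# with hue and saturation held constant, rgb just scales with brightness
tint = np.array(colorsys.hsv_to_rgb(0.58, 0.45, 1.0))   # a blue-ish tint, picked by eye

colorized = np.stack([gray] * 3, axis=-1)                # start as neutral gray rgb
colorized[mask] = gray[mask][:, None] * tint             # tinted area keeps its tones

Image.fromarray((colorized * 255).astype(np.uint8)).save("colorized.jpg")
```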

 

I hope that was either useful or interesting to you. I am by no means an expert in photo restoration, but I figured I’d share my experiences in case someone else out there is trying to do the same type of thing.

 

are we done yet?

this is the mini-series roundup. i've got to end this series so i can talk about other stuff, so i'm going to double up and cover dynamic range and RAW/jpg shooting in one post.

let's start with dynamic range. this is the range of intensities the camera can collect in a photo, from the darkest dark to the brightest bright. the best camera can only see about 15% of the dynamic range of the human eye. God : 1, nikon : 0. this is why when you snap a picture of a bedroom with a window during the day you either get the room exposed properly and the window totally blown out, or you get the window exposed properly and the bedroom very dark.
there's a technique called HDR (high dynamic range) photography. in its simplest form, you take multiple photos at different exposure settings without changing the camera position, aperture, white balance, or focal length. that leaves us with using the shutter speed to control the exposure. the idea is to take a photo that properly exposes the darkest darks, another photo that catches all the mids, and another photo that captures the brightest whites properly. you then take all the photos and blend them together so that every part of the photo is properly exposed. you can also tone-map the colors to give it a surreal feel. tone-mapping is another whole animal that probably only 1% of people would care about so i won't go into it. here's a couple HDR photos i took of the nylo hotel in warwick, ri.

cool, huh? you can get some pretty cool looking shots if you do it right. HDR doesn’t lend itself well to pictures of people or animals or anything that moves since all photos have to have the identical content but at different exposures.
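for the code-inclined, here's a bare-bones sketch of the blending idea in python with numpy and Pillow – a simple weighted exposure fusion, not the exact tool i used. it assumes three already-aligned, same-sized exposures with made-up names:

```python
# bare-bones exposure fusion with numpy + Pillow: blend three bracketed frames,
# letting well-exposed (mid-tone) pixels carry the most weight
# (assumes dark.jpg / mid.jpg / bright.jpg are aligned and the same size)
import numpy as np
from PIL import Image

frames = [np.asarray(Image.open(name), dtype=np.float64) / 255.0
          for name in ("dark.jpg", "mid.jpg", "bright.jpg")]

# weight each pixel by how close its brightness sits to a mid-tone (0.5)
weights = [np.exp(-((f.mean(axis=2) - 0.5) ** 2) / (2 * 0.2 ** 2)) for f in frames]
total = np.sum(weights, axis=0) + 1e-6

fused = sum(f * (w / total)[..., None] for f, w in zip(frames, weights))
Image.fromarray((fused * 255).clip(0, 255).astype(np.uint8)).save("fused.jpg")
```

real HDR software adds alignment, ghost removal, and the tone-mapping i mentioned, but the core move is the same: every pixel leans on whichever exposure captured it best.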

enough of that. onto RAW and jpg shooting. this could get ugly, but i'm going to refrain from too much tech talk. here's the super basic version – RAW allows you to capture much more dynamic range in a photo. it is uncompressed, the data exactly as your camera sensor sees it. every camera shoots in "RAW" internally, but all point-and-shoots add some processing and convert it to a jpg before saving it to your memory card. it's kind of like the digital equivalent of a film negative. in post-processing you can determine the white balance and exposure without any image degradation. a jpg will always degrade with every edit made to it. the drawbacks to RAW are that the files take up a lot more room on your memory card, each camera manufacturer has its own proprietary RAW format, and every RAW file needs post-processing in order for it to be useful. RAW files on their own are very bland, desaturated, and not so sharp. the possibilities they allow you in post-processing are the huge benefit. they take time to deal with and hard drive space to store.
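if you want to poke at the jpg degradation claim yourself, here's a crude little python experiment – re-encode the same jpg a bunch of times and measure how far it drifts from the original. the file name is made up, and keep in mind that real edits between saves do more damage than plain re-encoding:

```python
# crude demo of jpg generational loss: re-encode the same image repeatedly and
# measure how far it drifts from the first version (file name is hypothetical)
import io
import numpy as np
from PIL import Image

img = Image.open("original.jpg").convert("RGB")
first = np.asarray(img, dtype=np.float64)

current = img
for _ in range(25):
    buf = io.BytesIO()
    current.save(buf, format="JPEG", quality=85)
    buf.seek(0)
    current = Image.open(buf).convert("RGB")

drift = np.abs(np.asarray(current, dtype=np.float64) - first).mean()
print(f"average per-pixel drift after 25 re-saves: {drift:.2f} out of 255")
```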

i always shoot RAW. my camera has a RAW+jpg option which will capture both formats from a single shutter press. if i’m taking casual photos i’ll use that option in case i catch something i really want to tweak so i’ll have the RAW file. otherwise i can just store away all the jpg files and i’m all set.

there's more to the RAW/jpg debate, but it has to do with bit-depth and aliasing and histograms. hopefully the info in this post is enough for you to decide if you want to jump into the RAW world. adobe bridge and photoshop come with a background program called 'camera raw' which is capable of processing all the major manufacturers' RAW formats.

just a note – i wouldn’t recommend attempting HDR unless you’re shooting RAW. there’s not enough information in the jpg files to make the image come out nicely.

that’s it! the mini-series is over. a bit of a relief for me, i’ve got other stuff i want to start blogging about. an easy way to find all the posts that were part of the mini-series is to look on the right hand side under “categories” and click on ‘education’. all the posts are under that category. go crazy. thanks for sticking with me through all of these educational posts!

freakin’ out with flash

unfortunate group of people getting their picture taken at the bowling alley by someone with their pocket camera: “the flash didn’t go off!”
picture taker: “oh sorry let me take it again”
unfortunate group: “the flash didn’t go off again!”
picture taker: “ummm, i don’t know why this dumb thing isn’t working. let’s go see what these bowling shoes look like under the black lights!”

that probably sounds like a familiar conversation you often find yourself in. why doesn’t the flash go off? why do we care so much about the flash? when a camera flash fires, it lets off an extremely intense and very short burst of light. the shutter opens up before the flash fires and closes when it’s done. if you want to understand what the camera sees during a flash photo, grab a friend and go in a windowless room in the house where it’s pitch dark. have your friend stand on the other side of the room and wave their arms like a maniac. stare in the general direction of your friend making a fool of themselves and flip the room light on and back off again one time as fast as you can. on off. what did you see? if you did the light fast enough, you saw a quick snapshot of your buddy with their arms in a fixed position even though they were in constant motion. for the split-second that the light was on, their arms didn’t move too much, so to you it appeared that they were still. that’s what the camera sees (and records) on a flash photo. this is why it’s near impossible to get ghosting or motion blur in a photo using flash. that’s a good thing. on point-and-shoots, flash is your friend, quite possibly even your best friend.

if you have a good enough dslr, enough light, a fast enough lens, and have read the other posts in this mini-series, then you should be able to take a photo of motion and freeze it without using flash. otherwise, flash does the job.

another common flash issue is that it turns your friends into evil looking monsters by making their baby blue eyes into red scary fireballs. the cause of redeye is not a mystery, contrary to what paranormal seekers would like to believe. it's simply a reflection of the back of your eyeball, specifically the blood vessels on the retina, when hit with direct and intense light. if the flash is off-axis with the eyes, then your redeye will disappear. i understand with a point and shoot this is difficult to achieve because the flash is fixed and it points in whatever direction the camera is pointed. newer cameras have various methods of redeye reduction built into them which all but eliminate the problem. if you have a dslr with a speedlight (one of those add-on flashes stuck to the top of the camera), then you can modify the direction of the flash. point that thing at about a 60 degree angle toward the ceiling and you'll get a nice bounce to illuminate your subject and get rid of redeye. if you have a speedlight, do a lot of experimenting with bounce angles. you'll be surprised to see the differences in the various angles.

another nice way to soften the light is to use a diffuser. you can buy one for anywhere from three to probably over a hundred bucks. you can also use a piece of wax paper or tissue paper, even kleenex or toilet paper. be creative – when it comes to lighting it sometimes helps to be unorthodox and try weird stuff. place whatever material you have over your flash and see what the results are. anything goes, as long as it’s not completely opaque. you can even use colored material for strange pictures if you feel so inclined. diffusers work great, but i don’t recommend spending a lot on one. i bought one that fits on my sb-600 speedlight and it cost me $2.99 on ebay. i would argue it works just as well as the $100 version. it’s not rocket science, just plastic. some things are worth spending the money on – this is not one of them. put the saved money toward a better lens.

the rest of this post is exclusive to dslr cameras. i haven’t yet seen any point-and-shoots with the option to adjust the following parameters, but i suppose they could exist. external speed lights allow you to change the intensity of the flash. this is handy if you want to use flash but not have it overpower the photo. this can allow you to obtain subtle highlights without it looking unnatural. again, play around with it, it’s the only way. here’s a couple photos that show the difference between direct flash and bounced flash. notice how the direct flash photo has underexposed areas around where the flash directly hit and the indirect flash photo doesn’t. also notice the section of my basement that has a bunch of random stuff.

direct flash
bounced flash

there are usually two main flash modes available. one is front curtain and the other is rear curtain. the term curtain comes from old analog camera technology that we won't go into. what it controls is whether the flash fires right after the shutter opens or right before the shutter closes. the easiest way to understand the effects of this is just to see it. both of the shots were taken with a 2 second shutter speed while panning the camera from the right to the left. check out the photos:

front curtain

rear curtain

what's happening is that the floor and the candle get captured when the flash fires, but the flame is captured throughout the two seconds because it is its own light source. when the flash fires at the beginning of the shot, the flame trail lags behind the frozen candle. when it fires at the end of the shot, the flame trail has already been captured and the shot closes with the frozen candle.

one last thing worth mentioning is something nikon builds into their speedlights and higher end cameras called the creative lighting system. it allows you to set up a bunch of speedlights in a room and wirelessly fire them from the camera. this lets you get all sorts of cool light angles, but you have to own multiple speedlights (a few hundred dollars each) and a camera capable of commanding the speedlights. i believe canon just recently came out with a similar capability, but i primarily shoot nikon so that’s what i can speak to. if you end up diving into that world and you’re stuck, drop me a line and i can help you get it up and running.

i really do intend for these posts to be short when i start them but they end up growing pretty large by the time i’m done. trying to keep it basic but i also want to be thorough. questions? let me know if anyone tries anything funky for a diffuser…

next up: dynamic range/HDR

focal length funk

in its most basic explanation, focal length is how much you’re zoomed. the focal length most closely equivalent to that of the human eye is about 75mm. that’s not to be confused with the 35mm of ’35mm camera’, which is referring to the film gauge. those are two different things measured with the same unit (millimeters). what the 75mm focal length means to us is that when you put the camera up to your eye, the image you see won’t appear zoomed in or out but rather it will look just like it does without the camera.

alas, there's more to life than 75mm. lenses come in two main flavors – prime and variable. variable lenses allow you to change the focal length. a common kit lens with entry level dslr cameras for both nikon and canon is an 18-55mm. that means you can use the lens at any focal length between 18mm (considered wide angle) all the way up to 55mm, which will appear a little bit zoomed out compared to the naked eye. variable lenses are nice for obvious reasons, allowing you to very quickly adjust your focal length on the fly without swapping lenses. your other choice is a prime lens. an optimus prime lens has a fixed focal length. you may immediately think that to be a huge disadvantage. you may also think that you have a point-and-shoot and you can't swap lenses anyway. well, the second statement may be true, but the first is not. while primes don't afford you the luxury of quick focal length changes, they do have some advantages. there are three main advantages to primes: they are usually much sharper than variable lenses, they typically weigh less, and they often afford you a wider maximum aperture for less money than a variable lens.
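if you like numbers, here's the geometry behind 'how much you're zoomed' – the horizontal angle of view for a given focal length. this little sketch assumes a full-frame sensor that's 36mm wide (a crop-sensor dslr would use roughly 23.6mm instead), and the focal lengths are just the ones mentioned in this post:

```python
# horizontal angle of view: aov = 2 * atan(sensor_width / (2 * focal_length))
# (36 mm is the full-frame sensor width; a crop sensor would be roughly 23.6 mm)
from math import atan, degrees

def angle_of_view(focal_mm, sensor_width_mm=36.0):
    return degrees(2 * atan(sensor_width_mm / (2 * focal_mm)))

for f in (18, 55, 80, 200):
    print(f"{f}mm lens: {angle_of_view(f):.1f} degrees across")
```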

here’s an example to show the difference between a shot from the same distance at 80mm and a shot at 200mm

80mm shot
200mm shot

a little background in lenses: this is where i would attempt to flaunt my minor in physics, except i honestly don't remember too much from it. i did a lot of sudokus in college. light bends in a very predictable way when it passes through concave and convex glass. those pieces of glass can be lined up in series so that each successive element is fed light from the one before it. depending on the setup of the glass, one can concentrate or expand the light however they want within the laws of physics. remember those funny convex mirrors they used to have in the corner of department stores to make sure no one was pocketing a sweet nike t-shirt? it's kind of like that except rather than the light bouncing back like on a mirror, the light passes directly through.

camera lenses are often internally made up of multiple pieces of glass. a prime lens has all of its glass in fixed positions. a variable lens achieves different focal lengths by allowing you to move certain pieces of glass closer together and further apart from one another (usually done by turning the outside housing of the lens clockwise and counter-clockwise). the trade-off for allowing the glass to move is that the alignment is not quite as precise as in a prime lens. this reduces the sharpness of the lens. the added weight of a variable lens is due to the extra glass necessary to make the adjustments.

alright, that's probably more about focal length and lenses than any normal person would care to hear about. there's actually a little more about how the focal length changes the amount of 'compression' in the photo, but i think that's overkill for this mini-series. as always, if there are any questions just drop them in the comments. i'd also love to hear if this mini-series is helpful to any of you guys or if tears of boredom are streaming down your faces.

next up: flash

new year’s resolution

i'm sure you've seen the bumper stickers… "my dad has more megapixels than your dad". they are mean-spirited and hurtful, and even though i am a proponent of free speech, i think they should be banned. although it has subsided a bit, there is much buzz to be had about megapixels. it's really nothing more than marketing hype. i would say that the casual picture taker is fine with anything more than 3 megapixels. you can very safely print an 8×10 with a 3 megapixel file. really, are you going to be putting those pictures of your dog on a highway billboard? most likely not, so there's no need for 15 megapixels. on the other hand, more megapixels does afford you the ability to crop your photos afterward with less loss of quality.

** warning – there is math in the following paragraph **
wait a minute, what is a megapixel anyway? well, a pixel is short for “picture element”. it’s basically a dot. a megapixel is 1,000,000 pixels. actually it’s 1,048,576 pixels (2 to the 20th power), but who’s counting? a really low resolution picture could be 800 pixels by 600 pixels. multiply those together and that gives you the number: 480,000 pixels, or roughly a half of a megapixel. let’s keep with the standard aspect ratio of a typical photo which is 4:3 (800 over 600 gives a 4 to 3 aspect ratio). at 10 megapixels you have (10 (megapixels) * 1,048,576 (pixels in a megapixel) = 10,485,760 total pixels). think of it like a rectangle with a 4:3 aspect ratio and you get two equations:
(1) Area (in pixels) = X * Y
(2) 4/3 = X/Y
take equation 2 and solve for X
(3) X = (4/3) * Y
sub X into equation 1
(4) Area = (4/3) * Y * Y
solve equation 4 for Y
(5) Y = sqrt((3*Area)/4)
sub the total pixels into Area and solve for Y
(6) Y = sqrt((3*10,485,760)/4)
(7) Y = 2804 (rounded to nearest whole number)
sub Y into equation 3 to get X
(8) X = (4/3) * 2804
(9) X = 3739 (rounded to nearest whole number)

there you have it – a 10 megapixel shot will give you a photo with dimensions of roughly 3739 by 2804 pixels. now that i've written that out i realize it's pretty close to useless, but maybe it'll help you understand what all the numbers mean and where they come from.
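if algebra isn't your thing, here's the same arithmetic as a tiny python helper (it uses the 2-to-the-20th pixels-per-megapixel convention from above, which is this post's assumption – camera marketing usually means an even million):

```python
# the same arithmetic as a tiny helper: pixel dimensions for a megapixel count
# and aspect ratio (uses the 2**20 pixels-per-megapixel convention from above)
from math import sqrt

def dimensions(megapixels, ratio_w=4, ratio_h=3):
    area = megapixels * 1_048_576           # total pixels
    y = sqrt(area * ratio_h / ratio_w)      # short side
    x = area / y                            # long side
    return round(x), round(y)

print(dimensions(10))   # -> (3739, 2804) for a 10 megapixel, 4:3 sensor
```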

before you start drooling over the pixel count on some new camera that you want, i must warn you that more is not always better. the sensor in the camera is only so big and the pixels can only be so small. in order to fit more pixels onto the sensor, they have to get squeezed way closer together to use every bit of sensor real estate available; the result is that the pixel density increases. i’ll use an illustration to explain: imagine setting up bowling pins in a row evenly spaced 2 feet apart from one another. when you roll the ball toward the pins, chances are you’re only going to hit one pin since the ball is not two feet in diameter. the ball is the photon of light and the pins are the pixels on the sensor. this would be a low-density setup. now get those pins so that they’re all touching one another with no spacing between them. roll the ball again. you’ll probably knock down three or four with the one ball. if you were using the amount of pins that fell down to determine how many balls were thrown, that would be a poor representation because it would be indicating three or four balls were thrown. this is roughly equivalent to the photons of light spilling over into adjacent pixels because they’re too dense. on dslr cameras and probably even some point-and-shoots, the pixel density is specified. take a look at it when you’re doing your comparisons – it’s a much ignored but important camera specification.
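pixel density itself is easy to estimate even if the spec sheet doesn't list it – divide the sensor width by the number of pixels across it. here's a rough sketch; the sensor widths and pixel counts below are approximate, made-up-for-illustration numbers for a small-sensor compact versus an APS-C dslr:

```python
# rough pixel-pitch comparison: sensor width divided by pixels across it
# (sensor widths and pixel counts below are approximate, illustrative numbers)
def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    return sensor_width_mm * 1000 / horizontal_pixels   # microns per pixel

print(f"12 MP compact (~6.2 mm wide sensor): {pixel_pitch_um(6.2, 4000):.2f} microns")
print(f"12 MP dslr (~23.6 mm wide APS-C):    {pixel_pitch_um(23.6, 4288):.2f} microns")
```

the bigger the pitch, the more breathing room each pixel has – which is the bowling pin spacing from the analogy above.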

next up: focal length

white balancing act

sometimes the color is off so much in a photo that people look like oompa loompas from charlie and the chocolate factory. once in a while your friends might end up looking like smurfs. what gives? the white balance is off. modern cameras do an awesome job of calculating white balance, especially when flash is used, but there are times when it can get messed up.

the way it works technically is that the camera (or computer if you're doing post-processing) does its best to pick what it thinks is supposed to be a neutral gray tone in the picture. once it determines that, it bases all of the other colors off of that gray tone. if the camera picks the wrong color for its neutral gray, you can see why the colors would get skewed. here's the same photo with different extreme white balances:

color temperature : 2450K

color temperature : 10100 K
notice the labels on the pictures. white balance is measured on a kelvin scale, which is a temperature. it's the same temperature scale on the boxes of fluorescent light bulbs at lowe's or home depot. the higher you set the white balance temperature, the warmer the picture gets (orange and red hues). conversely, the lower you set it, the cooler the picture appears (blue and greenish hues). (the bulbs themselves run the other way – a higher kelvin bulb gives bluer light – because the camera setting is compensating for the light it expects.) a textbook photo has a perfectly balanced white point for true color reproduction. you can purposefully alter the white balance from its neutral point to emphasize something in the photo that you feel is there. color is a strong tool because it can evoke different emotions and feelings. subtle differences can make a big impact. take the next three photos for example. same photo, different white balances. do you see and feel the difference?
color temperature : 2850K

color temperature : 4550K

color temperature : 9900K

you can put white balance on auto mode or you can set it manually. even point and shoots will allow you to change the white balance; they just give it cute preset names like "daylight, cloudy, shade, incandescent, etc…" on dslr cameras they just let you choose the color temperature.
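if you're curious what 'picking a neutral gray' looks like in code, here's a rough gray-world sketch in python with numpy – it assumes the average of the whole scene should come out neutral and scales each channel until it does. that's just one simple way to guess the neutral point, not necessarily what your camera does, and the file name is made up:

```python
# rough "gray world" white balance: assume the scene should average out to neutral
# gray and scale each channel so it does (one simple guess; file name is made up)
import numpy as np
from PIL import Image

img = np.asarray(Image.open("off_color.jpg").convert("RGB"), dtype=np.float64)

channel_means = img.reshape(-1, 3).mean(axis=0)   # average R, G, B over the frame
gray = channel_means.mean()                       # what "neutral" should be
balanced = img * (gray / channel_means)           # nudge each channel toward neutral

Image.fromarray(balanced.clip(0, 255).astype(np.uint8)).save("balanced.jpg")
```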

i'm not going to delve into the jpg and raw shooting differences, but i do want to mention that if you shoot in jpg (all point-and-shoots force you to, dslr cameras give you a choice) then you cannot non-destructively alter the white balance in post-processing. you can change it slightly without noticeable degradation, but it's not the best. if you shoot in raw, you can use the entire spectrum of color temperature during post-processing in a non-destructive manner. there are reasons why but i'd hate it if my blog readers were all passed out in front of their computer because i started talking about image compression.

next up: resolution

ISO – how sensitive are you?

you may have heard the term 'ISO-9001 certified' before in some boring business conversation or seen it stamped on a big business ad somewhere. ISO comes from the international organization for standardization. they set guidelines and absolute measurement scales for things so everybody can be on the same page. one day, long ago, cameras used to have this thing called "film". it was a thin plastic-like material that you shoved inside the back and the picture was saved onto it. this so-called "film" had speed ratings, which were measured in ISO numbers. the speed of the film was actually referring to how sensitive it was to light. the higher the number, the more sensitive the film and the quicker it could gather enough light to expose properly, which would allow you to shoot in low light with faster shutter speeds. remember shutter speed? conversely, a low ISO number like 100 would be pretty insensitive to light. this would be the stuff you'd load into your camera if you were shooting outdoors on a bright sunny day.

ok, who cares, no one uses film anymore, right? that's pretty much true for 99% of people. we still care because digital cameras also have an ISO setting. in our fancy shmancy digital cameras there is a "sensor" that captures the image rather than using a roll of film. the sensor gets blasted with photons of light when the shutter opens and it records the intensities of the photons on each tiny subsection (these subsections of the sensor are called pixels – more on this in the future post about resolution). the ISO parameter controls the sensitivity of the sensor, the parallel of the film speed in the analog world.

still wondering why you should care? i've talked to a few people who say their pictures keep coming out "grainy" or "blotchy". this is referred to as "noise". i will spare you the origin of the term or else i'll start talking about least significant bits and resistive ladders and delta-sigma modulators. generally the higher the ISO, the more noise will be introduced into your photo. noise most typically shows up as individual pixels coming out much brighter or more off-color than they should be. in addition, at very high ISO levels there can be severe color degradation and other negative effects. if your photos look really bad, check to see if your ISO is manually set high. keep in mind that the better the camera, the more capable it is of producing clean photos at higher ISOs. if you're shooting with an entry level dslr, you probably will start seeing degradation once you pass ISO800. with a point and shoot you'd want to have it set to auto and let the camera calculate what it needs. the exception to this rule is if the camera keeps producing poor quality photos because the flash is off, there's not enough light, and the only option it has is to jack up the ISO so it has a prayer of exposing the picture properly. in this case you'd probably want to enable flash so the ISO can be reduced.

here’s an example of a clean image shot with ISO200

here’s an example of a noisy image shot with ISO6400

the easiest place to see the difference is on the hardwood floor and the back wall. coloration varies between the photos and most textures appear pixelated. the differences are easier to see when the photos are larger.
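if you want a feel for where that noise comes from without cranking your own camera, here's a crude simulation in python – take a clean image, pretend it was badly underexposed, add a fixed dose of sensor noise, then multiply everything back up the way a high ISO gain does. the file name and the noise level are made up:

```python
# crude simulation of high-ISO noise: underexpose a clean image, add a fixed dose
# of sensor noise, then apply gain to bring the exposure back up
# (file name and noise level are hypothetical)
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
scene = np.asarray(Image.open("clean.jpg").convert("RGB"), dtype=np.float64) / 255.0

dim = scene * 0.05                              # badly underexposed capture
noisy = dim + rng.normal(0, 0.01, dim.shape)    # read noise is there either way
boosted = noisy * 20                            # the "ISO" gain amplifies both

Image.fromarray((boosted.clip(0, 1) * 255).astype(np.uint8)).save("high_iso_look.jpg")
```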

now you know about shutter speed, aperture, and ISO. these are the big three for cameras. go mess around with your camera – the only way to learn is to try. good luck, and let me know if you’ve got any questions…
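for the spreadsheet-minded, here's how the big three trade off numerically – a little sketch of the standard exposure value formula. the settings below are just example numbers; the point is that different combinations can land on the same overall exposure:

```python
# exposure value for a given aperture, shutter speed, and ISO:
# EV = log2(f_number^2 / shutter) - log2(ISO / 100); equal EV = equal exposure
from math import log2

def ev100(f_number, shutter_seconds, iso):
    return log2(f_number ** 2 / shutter_seconds) - log2(iso / 100)

# two settings that land on the same exposure (example numbers):
print(ev100(2.8, 1/60, 400))    # wide aperture, modest ISO
print(ev100(5.6, 1/60, 1600))   # two stops smaller opening, two stops more ISO
```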

unrelated useless info: the bench cushion and the pillow in the foreground of these pictures were handmade by my lovely wife tara and the bench itself was built by yours truly. maybe i'll start bluefurniture if i find enough spare time.


next up: white balance

nice aperture

ever wonder how professional photographers get shots with the person in focus and everything else encapsulated in a dreamy haze? it doesn’t require any photoshop magic, but rather just a little understanding of aperture.

there is a part of the lens that is close to the lens/body junction, basically a hole, whose function is to let light through when a photo is taken. the amount that the hole opens is called the aperture. aperture is measured on a numeric scale where a smaller number indicates a larger opening on the lens. the aperture control can be thought of as your eyelid and how much you open it. the wider you open it, the more light comes in. when it's real sunny, you squint. why? to reduce the amount of light intake. cameras are very similar except they can't wear sunglasses.

in isolation, the aperture's main influence on your photo will be the depth-of-field. here's a couple examples to explain depth of field. this first picture has an aperture of 1.8 (referred to as f/1.8) while the second photo is taken at f/20. the object focused on is the garlic clove (the white thing that looks like a small alien spacecraft) in both photos.

above picture at f/1.8

above picture at f/20

notice how at f/1.8, the objects in front of and behind the alien spacecraft get blurrier the further away they are. at f/20, everything in the picture is relatively clear even though the focus point in both photos was the same. that's it. you want dreamy portraits? flip that camera into 'aperture priority' mode and start playing around with different aperture sizes. your camera will be gracious enough to adjust your shutter speed so you have perfect exposure every time. go crazy.

ummmmm, hang on. that’s not the whole story. let me tell you one more thing before you go open a photography business and put me out on the streets with your newfound knowledge. the other thing that aperture has a big influence on is how quickly it can get enough light into the sensor to correctly expose a photo. you can think of this like holding a poland springs bottle out in the rain as compared to holding a trash barrel out in the same rain storm. which will collect more water quicker? the trash can. that’s a larger aperture (remember – lower number, like our f/1.8 photo). ok, now you can go crazy.

here's a quick summary of aperture:
* smaller f-number, bigger opening, more light, shallower depth-of-field *
* larger f-number, smaller opening, less light, deeper depth-of-field *
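if you want numbers instead of analogies, here's a rough depth-of-field sketch using the standard hyperfocal-distance formulas – the focal length, subject distance, and circle of confusion below are example assumptions, not measurements from the garlic photos:

```python
# rough depth-of-field calculator using the standard hyperfocal approximation
# (50mm lens, subject 2 m away, 0.03 mm circle of confusion -- example assumptions)
def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
    s = subject_m * 1000                                   # subject distance in mm
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm     # hyperfocal distance
    near = h * s / (h + (s - focal_mm))
    far = h * s / (h - (s - focal_mm)) if s < h else float("inf")
    return near / 1000, far / 1000                         # back to meters

print(depth_of_field(50, 1.8, 2))   # f/1.8 -> a slice only a few tenths of a meter deep
print(depth_of_field(50, 20, 2))    # f/20  -> a zone well over two meters deep
```

run it and you'll see the f/1.8 slice is only a couple tenths of a meter deep while the f/20 zone stretches past two meters.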

dead horse beating section below (reader caution – could be considered rambling beyond this point):
still not clear on aperture? man, photography puns are horrible. anyway – hold your arm out in front of you and put up a few fingers. close one eye. you are now a camera. now let's set you to a low aperture. open your eye all the way and focus right on your fingers. without shifting your focus from your finger, try to take note of how blurry the objects past your finger are. this is mildly difficult to do, but it is possible. in this mode, you've got a shallow depth-of-field. let's switch you over to f/16. keeping focus on your finger and one eye closed, squint so your eye is barely open. you'll notice that even though you're focused on your finger, the objects in the background are nearly completely in focus also. you've got a deep depth-of-field, but your sacrifice is the amount of light you're able to take in with your eye all squinted and bunched up. let that sink in, it'll all make sense, i promise. don't forget to open up your other eye before reading the rest.

each lens has a limit to how low its aperture number can go. more expensive lenses can open up wider and are referred to as "fast lenses". low-light situations such as traditional church wedding ceremonies require fast lenses. the rule for wedding lenses is usually no slower than f/2.8, but that's going to hit your wallet pretty hard, take it from me.

next up: ISO

scrutinizing shutter speed

most people know that the shutter of the camera is what opens and closes when you take a picture. we’ll start with this component because it’s the easiest to understand of the three main contributors to your photo capture (the other two being ISO and aperture). shutter speed is measured in seconds, although more commonly in fractions of a second. the shutter speed can affect a few things in a photo, but in isolation its main impact is on the ‘fluidity’ of the picture. the easiest way to illustrate this is with some examples.

this first photo is taken with a shutter speed of 1/60 of a second, f/5, ISO 250, and with on-camera flash. for this post we’re only interested in the shutter speed parameter. notice how the water looks pretty sharp and ‘frozen in time’.

this second photo was taken of the same exact drip stream of water, but with a shutter speed of 1/30 of a second, f/11, ISO 1600, and no flash. the same flow of water now looks more fluid, almost like a complete stream. the use of the flash makes a difference, but the concept is illustrated the same.

what’s the deal? why does it do that? when the shutter snaps quickly (generally anything faster than 1/100 second) you typically get a clean picture that doesn’t have any blur. most things don’t move very much in 1/100 second. when you shoot with a shutter speed down in the range of about 1/30 second all the way down to multiple seconds, you are essentially taking a video that keeps overlapping onto the same photo. lots of things can move in a few seconds, even lazy people. as long as the shutter is open, it’s letting light in to hit the sensor and recording it.
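here's a back-of-the-envelope way to see the 1/100-of-a-second claim in numbers – estimate how far a moving subject smears across the frame during the exposure. every value below (walking speed, focal length, distance, sensor width, pixel count) is an example assumption:

```python
# back-of-the-envelope motion blur: how far does a moving subject smear across the
# frame during the exposure? (every number below is an example assumption)
def blur_pixels(subject_speed_mps, shutter_s, focal_mm, distance_m,
                sensor_width_mm=23.6, image_width_px=4288):
    moved_mm = subject_speed_mps * 1000 * shutter_s           # movement in the scene
    on_sensor_mm = moved_mm * focal_mm / (distance_m * 1000)  # projected onto the sensor
    return on_sensor_mm * image_width_px / sensor_width_mm    # converted to pixels

print(blur_pixels(1.5, 1/30, 50, 3))    # walking person at 1/30 s -> obvious smear
print(blur_pixels(1.5, 1/500, 50, 3))   # same person at 1/500 s  -> only a few pixels
```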

"matt, my point-and-shoot camera pictures keep coming out blurry and i don't know what's going on, plus i don't care about all this technical stuff. how do i fix it? ps – i love your blog." well first off that's very flattering. secondly, chances are that your friend, mr. shutter, is too slow. the simplest and most casual way to remedy that problem is by turning your flash on. this will let your shutter fire at 1/60 of a second or faster, and your blurs and ghosting should disappear. there are other ways, but they'll be explained in a later post after we learn about a few other things.

due to a high risk of beating a dead horse by continuing this little shutter biography, i will end this post here. if you have any questions, feel free to leave 'em in the comments and i will reply as quickly as i can and try to help you out.

next up: aperture
