If you haven’t noticed, the days are becoming shorter, it’s not as warm out and—depending on where you are—the trees probably have changed color. Welcome to winter, sometimes a seasonal smorgasbord of aviation weather conditions. May we interest you in some unexpected fog? Perhaps a premature sunset? How about a nice mix of solid overcast, low sun angle and variable winds?
If you’re a fair-weather pilot, you may be tempted to hang up the headset for a few months, at least until it’s time to adjust the clocks forward. For the most part, that’s the wrong reaction, but it’s clear the short days, low sun angles and long shadows, and freezing precipitation winter brings can tax our individual systems. Fortunately, advances in several areas mean we have some technology-based hedges against the season. There being no free lunch, these enhancements also come with their own shortcomings, and it’s important to know what they are.
Eye Can See Clearly…
For the safest, sanest winter flying—instrument or visual—we really need to know and understand our own limitations, how to counter them with technology and where those technologies can fail. And we should never forget that counting on these tools, or any tool, to perform as you hope can be risky, even in severe-clear conditions.
One example is the proliferation of synthetic and enhanced vision systems. An enhanced vision system (EVS) generally is based on an infrared camera mounted somewhere on the airplane, feeding video to the instrument panel, preferably overlaying a moving map. Think of it as a civilian version of FLIR—forward looking infrared—in wide use with military and search-and-rescue operators, like the U.S. Coast Guard and EMS helicopters.
Infrared sensors “see” in a fairly narrow spectrum of infrared light—“heat,” some would say. Because they work in a spectrum the unaided human eye can’t see, they can show wildlife, people, vehicles, even the marks and numbers painted on runways. They do so by comparing the infrared light given off by various objects and rendering the contrast on the display screen that delivers the image captured by the sensor.
Sounds great, right? You bet—and it’s an excellent tool, as long as you understand its shortcomings. For example, and depending on its design, an EVS sensor may not do a good job “seeing” through clouds or blowing dust, at least not as well as if the obscuring phenomena didn’t exist. They do well enough in low-visibility conditions, though, that the FAA allows a lower DH on instrument approaches for users of approved EVS equipment. One hundred feet lower.
One “gotcha” you may want to be aware of, though: They won’t automatically “see” energy given off by light-emitting diodes, LEDs, a lighting technology enjoying rapid acceptance thanks to its low energy consumption and long life. And wouldn’t you know it: LED lighting is becoming popular in illuminating runways and taxiways. But for EVS systems to detect IR from an LED-based lighting system, the LEDs must include an IR source in addition to the equipment producing visible light for our eyes.
The trick is LEDs produce light only in the frequency spectrum for which they’re designed; unlike incandescent, halogen or fluorescent sources, which emit energy all over the spectrum, they don’t naturally emit IR.
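The idea reduces to a simple question of overlap: does the light source emit anything inside the band the sensor can see? Here’s a minimal sketch of that check, with all the wavelength figures rough approximations chosen for illustration, not specifications of any particular sensor or lamp.

```python
# Hypothetical illustration: a sensor "sees" a light source only if the
# source's emission band overlaps the sensor's sensitivity band.
# Wavelengths are in nanometers and are rough, illustrative figures.

def bands_overlap(source_band, sensor_band):
    """True if the two (low, high) wavelength ranges overlap at all."""
    return source_band[0] <= sensor_band[1] and sensor_band[0] <= source_band[1]

# A long-wave IR sensor sees roughly 8,000-14,000 nm ("heat").
LWIR_SENSOR = (8_000, 14_000)

# An incandescent bulb radiates across a broad band, well into the IR.
incandescent = (400, 20_000)

# A white LED emits essentially only visible light, about 400-700 nm.
led_visible_only = (400, 700)

print(bands_overlap(incandescent, LWIR_SENSOR))      # prints: True
print(bands_overlap(led_visible_only, LWIR_SENSOR))  # prints: False
```

The broad-spectrum bulb overlaps the IR band and shows up; the narrow-band LED doesn’t, which is exactly why an IR emitter must be added alongside the visible-light LEDs for an EVS to see the runway lights.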
The EVS technology is an excellent year-round tool, but to avoid a trap it’s worth checking destination airports to learn whether they’ve upgraded to LEDs. If so, ask whether the landing aids have been equipped with IR-spectrum LEDs.
SVS: Current Data Is Key
Another excellent aid to low-visibility flying, the synthetic vision system (SVS) also is spreading throughout general aviation. The technology is available for some certificated systems as a software upgrade for as little as $2500, installed. Of course, it’s also growing in popularity among those able to fly with experimental avionics.
The latest SVS employs a high-resolution database which uses input from a GPS to display an accurate, near-photographic representation of the world outside—along with the correct direction, angle and velocity for your GPS-determined position and relative movement. Some who’ve sampled this technology comment about how much it reminds them of a high-end video game—except with real sound from a real airplane moving over real territory.
Two other significant differences exist between the real-world SVS and video games, however. First, if you fly the video game into rising terrain, get trapped up a video-generated canyon or come up short on approach, the cracked structure is synthetic and the crash temporary. In the real world, it’s a real accident, with real consequences.
Second, the video game can only challenge you with obstacles programmed into its memory. Likewise, the airplane-mounted SVS can show you only what’s in its database. So a new cellular antenna or building erected since the last data update won’t show. Neither will any portable object, like a vehicle stopped on the taxiway or the wild animal crossing the runway in front of you (although an EVS can depict these objects).
In flight, SVS is designed as an aid in low-visibility conditions: You can see what’s below you even when in or above an overcast. It’s not designed to help you fly through canyons or to try to land below minimums. And even when the database is updated annually (the minimum virtually every vendor recommends), give yourself a break and review the charts with an eye toward obstacle markings that made the chart cycle, if not the database cycle.
So remember: even if the SVS in your panel or on your tablet has the newest database available, even if it’s proven to match up the runway depiction with your actual position, it will never, ever, actually “see” anything it hasn’t previously memorized.
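The limitation described above is easy to see in miniature. The sketch below is a toy, with an invented obstacle database and invented objects, but it shows the essential behavior: the display can only draw what made it into the last data cycle.

```python
# A minimal sketch of why SVS can only show what it has "memorized."
# The database contents and cutoff date are invented for illustration.
from datetime import date

# Obstacles this hypothetical SVS database knows about.
database = {
    "Ridgeline tower": date(2020, 3, 1),
    "Grain elevator":  date(2019, 8, 15),
}
database_cutoff = date(2021, 1, 1)  # when this data cycle was compiled

# Objects actually out there in the real world today.
real_world = [
    "Ridgeline tower",             # in the database: depicted
    "Grain elevator",              # in the database: depicted
    "New cellular antenna",        # erected after the cutoff: invisible
    "Deer crossing the runway",    # portable; never in any database
]

for obj in real_world:
    status = "depicted" if obj in database else "NOT depicted"
    print(f"{obj}: {status} on SVS")
```

The last two lines of output are the trap: everything the database lacks, the display lacks, no matter how current your GPS position is.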
When You Need a Weatherman
Another “beware” technology with major implications for winter flying is weather datalink. You see, depending on the hardware and delivery system, what you see may not be what you get. Put another way and as we’ve seen a number of times, what we saw was not what we were getting. And what we were getting wasn’t friendly. The problem we encountered involved consumer-grade products used in the aviation environment, but even equipment designed and produced with the cockpit in mind can exhibit unwanted behavior.
What we’ve seen involves displacement of storm graphics by mid-double-digit distances on consumer-grade tablets using an external GPS source to determine position and a separate software package to display the weather graphics from the datalink service. The result was storm images displaced far beyond anything the service’s known latency could explain.
The systems showed our location as clear, with a severe storm passing some 40 to 50 miles away; aviation datalink services displayed on aviation-specific products, and portable GPS receivers compared at the same time, showed a displacement of only a few miles—as you’d expect when comparing an eyeball view with the six- to 10-minute-old images delivered by the datalink.
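The back-of-the-envelope arithmetic behind that expectation is worth making explicit. Assuming a storm-motion speed (the 40-knot figure below is an assumption for illustration), the distance a cell can drift while the image ages is just speed times time:

```python
# Back-of-the-envelope check on datalink radar latency: how far can a
# storm actually move while the image ages? Storm speed is an assumption.

def expected_drift_nm(image_age_min, storm_speed_kt):
    """Nautical miles a storm moves in image_age_min minutes."""
    return storm_speed_kt * (image_age_min / 60.0)

# A fast-moving cell at 40 knots with a typical 10-minute-old image:
print(round(expected_drift_nm(10, 40), 1))  # prints: 6.7

# A 40-mile displacement at that speed would imply an hour-old image:
print(round(expected_drift_nm(60, 40), 1))  # prints: 40.0
```

A few miles of drift is consistent with normal latency; a 40- to 50-mile error is not, which is why we chalked the tablet displays up to a positioning or overlay problem rather than image age.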
Is it a problem with the device knowing accurately where it is? A problem with how the software overlays the base map with the image? Or a problem of how the device itself tries to combine the imagery? In asking a dozen different avionics and software vendors, the answers held no consistency—except for the common admonition. As always, those images should be considered only for advance planning: We’ve never recommended using them to navigate actual weather—even when backed up by a sferics device.
For our purposes, the advice we think you’ll take is this: Get to know how your portable datalink-weather receiver behaves around real weather before relying on it in the cockpit.
TAWS: Another Database
Terrain awareness and warning systems—TAWS—present another database trap, one which pilots need to understand regardless of weather or time of year but which becomes particularly critical to understand when weather conditions go south.
Various classes of TAWS exist to fulfill varying degrees of regulatory requirements and aircraft needs. Like SVS, however, they all employ a very large database, along with processing capability that compares GPS-generated position, altitude, track and speed data against terrain and man-made obstacles such as buildings, radio towers, television towers, bridge towers—even the harvesting machinery of a relatively new crop, electricity. Wind farms can cover thousands of acres with structures reaching higher than all but the tallest transmitter towers. We’re talking 500-plus feet in some instances.
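That comparison is conceptually simple, and a stripped-down sketch helps show why the database matters so much. Everything here is invented for illustration—the elevations, the 500-foot clearance buffer, the look-ahead time—and real TAWS logic is far more sophisticated, but the skeleton is the same: project the position ahead along the track, look up what the database says is there, and warn when the margin gets thin.

```python
# A stripped-down sketch of a TAWS-style look-ahead check. The elevation
# values, clearance buffer and look-ahead time are invented for illustration;
# certified TAWS logic is far more elaborate.

def terrain_alert(alt_ft, groundspeed_kt, lookahead_min, elev_ahead_ft,
                  clearance_ft=500):
    """Warn if any database elevation within the look-ahead distance
    comes within clearance_ft of the aircraft's altitude."""
    lookahead_nm = groundspeed_kt * lookahead_min / 60.0
    worst = max(elev_ahead_ft)  # highest terrain/obstacle ahead in the database
    if alt_ft - worst < clearance_ft:
        return f"TERRAIN: {worst} ft within {lookahead_nm:.1f} nm"
    return "clear"

# Cruising at 3,000 ft toward a ridge topped by a 2,700-ft wind turbine:
print(terrain_alert(3000, 120, 1.0, [900, 1400, 2700]))
# prints: TERRAIN: 2700 ft within 2.0 nm
```

Notice the catch: `elev_ahead_ft` comes from the database. A turbine erected since the last update simply isn’t in the list, and the function cheerfully returns “clear.”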
So check that database on your TAWS, mini-TAWS, even your GPS navigator, since its top-down terrain-and-obstacle warning graphics also depend on a current database.
EFB: Learn First, Fly Second
Lots of aviators have moved to electronic flight bags in recent years, thanks in part to the iPad’s popularity as an all-around portable computing tool with long battery life and a sharp screen. New and exciting as they are, EFBs—no matter the hardware platform—have already generated reports to NASA’s Aviation Safety Reporting System and shown up in NTSB and FAA accident summaries. The issues typically involve pilots missing checkpoints, busting altitudes or overflying a reporting point.
The reasons typically involve trouble using the digital chart functions, with panning, zooming and paging frequently mentioned. Many problems boil down to fixation—the pilots became so fixated on trying to get the device to give them what they wanted that they lost all sense of their other flight responsibilities. Not a good thing.
One solution: Learn how to turn on, turn off, find, select, change, pan, zoom, shrink, and move from maps to approach plates to SIDs and STARs in the comfort of your home or office. Then get in the airplane and do it all over again. And—despite improving battery life among all portable products—make sure you have a spare battery or a way to plug in to ship’s power, even when you think you know how long it will run. Or just keep the paper handy.