Education

We are committed to education. We like meeting deadlines and budgets, and the more we and our partners share knowledge about each other's processes and needs, the more likely everyone involved is to meet them. Our team has degrees in physics, engineering and music. We have experience in performing, recording and composing music; we have worked on scoring stages and on dub stages; we have designed and installed audio rooms for many of the major film studios. We have built lasers, computers and tube amplifiers (and that's just for fun!).

We feel our experiences make us well-suited to help you navigate today's technological landscape, and we like sharing that knowledge with our colleagues. From audio codecs to video delivery specifications, we will work hard to give you the tools you need to get your job done. We may not be the smartest guys in the room, but we'll do our best to deepen your knowledge where you need it.

Have a look around to see what we can do to help make your project and your process better, faster and more efficient. Please peruse our blog too, as we tend to publish more current and timely items there. And if you're curious about something you don't see here, please feel free to ask - we're here to help!


Black levels

Standard Definition Video

When working with video, we have to concern ourselves with its brightness output, which depends on what is referred to as the black level. There are two types of black that we typically work with: absolute black (or computer black) and video black. Video black is lighter in color, but it lets us see much more detail in shadows and helps desaturate colors that might otherwise bloom out too much. We saw this effect far too frequently in the early days of music video.

This is because of the ways in which televisions handle video. Standard definition video must conform to the NTSC specification, which defines the range from black to white. The history of this range definition goes back to analog video (for the interested reader, a more detailed description can be found here). For standard definition digital video, the range is defined with 8 bits, and therefore has 256 possible values (ranging from 0 to 255).

The minimum brightness level defined in Standard Definition video corresponds to the 17th step, number 16. The maximum brightness level corresponds to the 236th step, number 235. This is why NTSC brightness is referred to as 16-235; it fits into a subset of the possible 8-bit range of brightness. We refer to this brightness range as the video's luminance.

So how does this relate to the real world? The answer is that different devices and environments have different video output characteristics, and if the brightness levels of a video do not correspond to the target output device, severe degradation and loss of quality can result.

In this first example, we have a video that was designed on a computer, but is destined for DVD. A computer can take advantage of the full 256-step range of Standard Def video. The DVD, however, must output 16-235 video. If the video setup levels range from 0-255 on the computer and are not adjusted for the DVD's range, we'll see clipping at both the high and the low end of the brightness range.

In order to faithfully reproduce the original video, we need to adjust the video’s scale to the NTSC range of 16-235. This is why it is critical to have accurate color bars at the beginning of each video we produce. Without color bars (or with color bars that do not match the video), the black level and chroma setup of a video becomes a guessing game. For more about chroma, please refer to the article here.

Using the color bars at the beginning of a video in conjunction with the video content itself, we can adjust the scale of the video output so that it fits correctly within the NTSC color space. We do this electronically, either while capturing a video source or before processing it for its final format. This adjustment ensures that colors and black levels will appear accurately on the end-user's playback system.
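The core of that adjustment is a simple linear remap of luma code values. As a sketch (the function name is ours; real capture hardware also handles chroma and gamma):

```python
def full_to_video(level):
    """Linearly remap a full-range (0-255) luma code value
    into the NTSC video range (16-235)."""
    return round(16 + level * (235 - 16) / 255)

# Computer black (0) lands on video black (16);
# full white (255) lands on video white (235).
print(full_to_video(0), full_to_video(128), full_to_video(255))
```

Running the remap over every pixel compresses the computer's 0-255 scale into the legal broadcast range, so nothing clips at either end.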

High Definition Video

HD video, 4k video and the higher frame size formats are different from SD video in numerous ways: their data rates are much higher, their display format is (nearly) always in widescreen mode, and the display resolution can vary from 480 x 854 to 720 x 1280 to 1080 x 1920, 4k, 8k and beyond. They also use a different luminance scale. UHD video goes even further with its color gamut, but you can read about that in the UHD section.

Because High Definition video doesn’t adhere to the older NTSC standard, it can take advantage of the full 8-bit range of luminance. This means that blacks can be blacker and whites can be brighter without taxing the playback device used to view the video. While color bars are still necessary, black and white levels tend to require minimal adjustment (and may not need adjusting at all).

What’s more, HD video has the capacity to handle 10-bit video, further enhancing the number of steps between absolute black and absolute white. This pushes the scale from 0-255 up to a healthy 0-1023 range. What this also means is that the data rate of HD video can be upwards of 440Mbits/s (or 880Mbits/s for 4:4:4 video).
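A quick back-of-the-envelope check on those step counts:

```python
# Quantization steps available at each bit depth
for bits in (8, 10):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps} steps, code values 0-{steps - 1}")

# A 10-bit signal has four times as many steps between
# absolute black and absolute white as an 8-bit one.
```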


UHD COLORSPACE

We have a new colorspace in our midst. Until recently, we've been working only with SD Rec. 601 and HD Rec. 709 video, with their 8-bit color and dynamic range. Concurrent with the advent of 4k and higher video frame sizes comes a new color gamut, called Rec. 2020. Rec. 2020 supports 10- and 12-bit video depth, meaning wider dynamic range and a much wider color gamut.

The difference between Rec. 709 and Rec. 2020 (shown to the right) is that Rec. 2020 allows a much broader portion of the human-visible spectrum to be reproduced.

By expanding the spectrum, we expose ourselves to frequencies that never before existed in broadcast video, meaning that converting from one colorspace to another is no easy feat.
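One way to put a number on "much broader" is to compare the areas of the two gamut triangles on the CIE 1931 chromaticity diagram, using each standard's published primaries (a rough sketch; serious gamut comparisons are done in more perceptually uniform spaces):

```python
# CIE 1931 (x, y) chromaticities of the R, G, B primaries
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def gamut_area(primaries):
    """Area of the triangle spanned by three primaries (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = gamut_area(REC2020) / gamut_area(REC709)
print(f"The Rec. 2020 triangle covers about {ratio:.1f}x the area of Rec. 709")
```

The ratio works out to roughly 1.9, which is why colors legal in Rec. 2020 can have no meaningful representation at all in a Rec. 709 signal.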


MASTERS OF THE DEAD FORMATS

The audio industry has seen bucketloads of emerging technologies – most of which had the best of intentions behind them – bursting onto the marketplace only to fall with a muted thud. This page chronicles some of those technologies, many of which were pioneered or otherwise championed by Craigman Digital and its business partners.

In this sector, we refer to ourselves as Masters of the Dead Formats, and proudly so. Part of Craigman Digital’s mission is to facilitate the use of new technologies that enable our clients and business partners to deliver high quality audio and video products to consumers. When our clients request a solution to a problem, we respond with applications and products that fulfill the need.

When a product fails, it can be due to any combination of factors: lack of consumer interest, confusing marketing messages, competing formats and irregular industry support, to name a few. This section serves as a tribute and an obituary of sorts to those dearly departed formats.

8-Track Tape

The 8-Track was a continuous-loop analog tape cassette. Most music fans over the age of 35 recall at least one older sibling or cousin with an 8-Track player in his or her car. The 8 tracks were actually 4 stereo programs laid side by side along the loop; at the end of each program, the playback head shifted over to the next pair of tracks. Its semi-brief lifespan saw widespread acceptance among music companies, but lukewarm acceptance among consumers.

Time of Death: ca 1988

Factors leading to its demise:

You couldn’t record to 8-Track tape – all recordings were factory-made albums only.

Fixed tape lengths in manufacturing meant that some albums changed sides in the middle of a song.

Lost popularity to the smaller and more flexible cassette tape.

Cassette Tape

A very practical analog tape delivery system, the cassette ambled along at a blinding 1-7/8 inches per second. Though it was widely accepted as a home recording medium, it sustained limited support among record companies. Advances in tape formulations helped to improve its stability over the years, yet with the advent of the digital Compact Disc, its days were numbered.

Time of Death: ca Late 2003

Factors leading to its demise:

Limited sound quality – tape hiss became a real problem when copies of copies of copies were made.

Bulky – while slimmer than its 8-Track cousin, a cassette collection took up lots of space and after a while became very, very heavy.

Technology – the cassette was no match for the fidelity of the compact disc.

Mini-Disc

The first foray into a lower-fidelity digital audio delivery system, the Mini-Disc tried to shrink down the footprint of its big brother, the Compact Disc. The smaller size and bitrate of this product were due to the use of the newly created Adaptive Transform Acoustic Coding, or ATRAC, a relative of AAC. While smaller in size, the codec was not nearly transparent enough to go undetected by the ear. It came, it saw, it slowly slunk away.

Time of Death: Late 1990s

Factors leading to its demise:

Lower audio quality than CD.

Expensive first-to-market players.

No recording on first-to-market players.

Consumers didn’t want to have to pay for a new format that had no apparent advantage over the CD.

Compact Disc

Rumors of its demise are greatly exaggerated . . . thus far.

DVD-Audio

The first of our dead formats to receive the attention of Craigman Digital, DVD-Audio was first conceived at the same time as DVD-Video. Its specification included (optional) copy-protected high-resolution audio in both stereo and surround sound, browsable menus displayed during music playback, video playback, and full backward compatibility with existing DVD-Video players.

While you could play a DVD-Audio disc in your regular DVD player, you had to upgrade to a new DVD-A-compatible player to enjoy the full-resolution playback that the format offered. While not an easy format to create, DVD-A is very easy to operate, and the sound is unparalleled. Supporting PCM bit depths up to 24-bit and sample rates of up to 192kHz (stereo – surround sound maxes out at 96kHz/24-bit because of the format's bandwidth limitations), this format delivered precisely what it promised – the best high-resolution surround-sound audio experience consumers could buy.

But Sony thought different.

While DVD-Video was released en masse in 1996, DVD-Audio suffered costly delays. Disagreements among music industry leaders over effective copy-protection measures, along with competing technical camps (mostly centered around lossless audio compression), meant that the format would not see the light of day until 4 years after its video counterpart was released to the public.

During this time, Sony and Philips teamed up to tout another high-resolution audio format, the Super Audio Compact Disc (SACD). This format also promised high-resolution surround-sound audio. Both formats got press, both got accolades, and both got the thumbs down from impatient consumers as news articles came out comparing the two to Beta vs VHS.

Craigman Digital helped the DVD Forum develop and realize some of the technical implementations behind this format. President Craig Anderson spent four years speaking on panels, contributing to white papers, writing articles and lecturing on the benefits of DVD-Audio. He frequently went toe-to-toe with SACD proponents, fighting the good fight in that format war.

Though DVD-A still enjoys a fair amount of attention in some limited European markets (classical recordings, mostly), the format has not seen wide release in the US since 2008.

Time of Death: November 20, 2008 (US)

Factors leading to its demise:

A multi-million dollar format war.

Ongoing audiophile-induced controversy over its optional watermark feature.

A confusing marketing message left brick-and-mortar stores unable to place the product in the proper location.

Consumer shift toward low-resolution downloadable music.

Super Audio Compact Disc (SACD)

Super Audio Compact Disc was the cooperative brainchild of Sony and Philips. Instead of digital PCM (Pulse-Code Modulated) audio, SACD used Direct Stream Digital (DSD), a format conceived in the 1980s by Sony and intended as an archiving standard. DSD employs 1-bit sampling at the staggering rate of 2.8224MHz – 64 times the CD sample rate. SACD listeners frequently describe a euphonic quality to its audio that is difficult, if not impossible, to quantify.
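For reference, the DSD sample rate works out as a simple multiple of the CD rate:

```python
# DSD's 1-bit sample rate is 64x the CD rate of 44.1 kHz
CD_RATE = 44_100            # Hz, 16-bit PCM on Compact Disc
DSD_RATE = 64 * CD_RATE     # Hz
print(DSD_RATE)             # 2822400, i.e. about 2.8 MHz
```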

SACDs cannot be played in standard DVD players, which cannot read the format's high-density layer. They can, however, be played in most CD players. The clever trick here is that Sony/Philips figured out how to manufacture a hybrid disc with the SACD on one layer and a standard Compact Disc on the other. Put the disc into a regular CD player, and it plays the regular CD layer. In an SACD player, you get the high-resolution experience.

The format war between Sony/Philips and the DVD-Audio camp intensified across several fronts and accusations (about most of which the average consumer couldn't have cared less): DVD-Audio plays video. SACD promised motion video on its titles (and never delivered). DVD-Audio requires brickwall filtering in order to convert audio streams (it never has). SACD's euphonic quality is created by the enormous amount of noise inherent in DSD, which must be shifted out of the audible band and into the supersonic (but they named it Super Noise Shaping, so that's alright, then).

In the end, Sony/Philips threw in the towel. DVD-Audio won the battle, but neither format won the war. To the tune of countless millions.

Time of Death: August 2006

Factors leading to its demise:

A multi-million dollar format war.

Recognition by Sony/Philips execs that a stalemate was better than gambling countless millions more on the dwindling prospect of a per-disc license in an ever-growing downloading environment.

Consumer shift toward low-resolution downloadable music.

DualDisc

In 2005, music industry giants, some licking their wounds from the DVD-A/SACD format war, came together to find a solution to the problem caused by the disparity in disc-based music formats. Consumer complaints about having to buy too many players to accommodate too many discs were heard, and an industry-wide solution was proposed. Its name was DualDisc.

DualDisc took a cue from SACD in that it was a dual-format disc (except that it had two layers on two sides instead of one). Put it in a CD player – it plays like a CD. Put it in a DVD player – it plays like a DVD. The format was compatible with DVD-Video and DVD-Audio, and it played in CD players, too. Win-win!

Unless you had a slot-loading player.

Testing reports began to point out a flaw in the construction of this clever little disc: it was too fat. You see, a CD's laser requires a little more distance to reach its correct focal length, and that means thicker plastic. The CD layer has to be about as thick as a regular CD, but it also has to be glued to the back of a DVD. That made the disc about 30% fatter than either the CD or DVD specification would allow, and it started jamming in automobile players and in slot-loading (aka "toaster style") computer CD drives, such as Apple's.

Industry giants grew more worried when the DVD Forum declared that the DVD logo could not be used on the DualDisc because its thickness was out of spec. Similar problems prevented the Compact Disc logo from appearing on the packaging. The format died a very fast death.

Factors leading to its demise:

It jammed players.

It was forbidden to carry either the DVD or the CD logo, so no one knew what the heck it really was.

Marketing tactics bumbled the message, and the format was further crippled by a really, really, really, really silly modified jewel case.

UMD

In 2005, we were approached by a music label to see what we could do with a PSP. Specifically, the label was looking for a way to replace DVD with the UMD, Sony's proprietary specification for media delivered on disc via the PSP. Email signatures from Sony began including "The newly developed Universal Media Disc (UMD) is the next generation portable format for movies and other forms of digital content." Clearly they wanted in on the iPod's action. They never got there.

Its authoring interface was no interface at all – just Java and markup, with no GUI. The specification was unforgiving, and video was only 272 pixels tall. Craigman Digital procured a PSP and some sample titles while testing out the H.264 encoder. After a few months of testing and scripting, we flatly said, "um, no." And the rest was history.

Time of Death: Early to mid 2006 (though Sony won’t declare it dead yet)

Factors contributing to its demise:

Impossible to author – compressing video took over 24x real time, even on hardware rendering farms. No authoring interface.

Disc cases kept breaking – consumer complaints about the disc’s protective case not being so protective were well-founded. They filled with dust and other nasties.

Craigman Digital refuses to support the format, leading to its immediate decline.

MVI

Also known as the IMA (Interactive Music Album), the MVI (Music • Video • Interactive) was a brilliant crossover offering that went nowhere fast. The Warner Music Group, intent on offering an alternative music product that would bridge the gap between CD and DVD (and mindful of the debacle of the DualDisc effort), turned to another hybrid offering that would give consumers access to music, videos and pre-prepared MP3s for immediate download.

When WMG turned to Craigman for a solution, we pushed the envelope.

In the winter of 2006, Craig and Dave surveyed the landscape of the digital world. Realizing that music alone could not sustain the industry, they looked to two emerging video formats, HD-DVD and Blu-Ray, and upped the ante, forming a partnership with a company from the web technology sector. Utilizing HD-DVD's ability to connect online content to a disc-based product, we created the Interactive Music Album, a DVD-based product that included a media player integrating both on-disc and online content.

Registered users would open the player on their PC to play songs and videos, and to access bonus features, such as IM icons, desktop wallpaper, screensavers and other applications. Once online, the MVI could reach out to the internet and download new and updated content to the user’s computer for integration into the MVI, giving the appearance that the new content was downloaded to the DVD.

The MVI offered users continuous streams of new content over an extended (and potentially infinite) period of time, and it offered labels unprecedented access to users’ online activities, which could be leveraged in the form of sharper marketing strategies and customer feedback. The platform could even be shipped with no on-disc content whatsoever, and an entire product could be packaged after the disc was manufactured and shipped, severely reducing production time.

Unfortunately, though the MVI was a software product, it was not treated as such by production teams. The production process in the music industry is always an accelerated workflow: a project begins with the artist and ends with production, and the pace quickens as the delivery date approaches. It is a freight train with a huge head of steam.

The combination of software development and its required testing and QA increased authoring timelines from two weeks to six – a necessary expansion for a software product that integrates hardware, multiple computer platforms, content-hosting environments, CMS environments, security requirements and disc-based content. Labels could not afford the time it took to create the products in the required development cycle, and they could not afford the resources to begin building titles themselves (though a build-it-yourself application was provided).

The daunting task of amassing all of the content that MVI required and then waiting over two months for the product to hit the streets was too big a burden for labels to take on, and with a collective sigh of relief from product managers, MVIs began dropping off of production schedules. By February 2008, all orders had ceased.

Time of Death: February 11, 2008

Factors contributing to its demise:

Product rushed to launch – development was not complete at the time of its maiden voyage, and the music industry was not prepared for the burdens of a regularly released software product.

Slow software launch – because of labels' product requirements, Macromedia's Director was a component of the MVI. Director was possibly the slowest software in the world, and consumers hated the wait time.

Very confusing production process – suddenly injecting software terms such as client, server, registration and XML into the production process proved extremely overwhelming to those unfamiliar with the terms.

New promises on the horizon – hints about a faster alternative to the MVI encouraged labels to resist the MVI platform and wait for another better, faster one. Which never came.

HD-DVD

The most recent format to bite the dust is HD-DVD. As with DVD-Audio and SACD, the advent of a high-resolution video product was eagerly anticipated by audio- and videophiles alike: it boasted 1080p resolution, lossless surround-sound audio, picture-in-picture capability, non-invasive menus, and internet connectivity to enhance the viewing experience. The world had gotten a taste of high definition video in the form of HD cable and television broadcasting, and it was a very exciting breakthrough for a standard that had been around since 1928, and had seen no significant technological advances since 1954.

HD-DVDs stored 15GB per layer (compared to DVD's 4.7GB), used more efficient HD codecs, and could be manufactured with up to 3 layers of data, making a 45GB behemoth of storage. Its authoring platform was based on a combination of ECMAScript (the standard underlying JavaScript) and XML, requiring a whole new skillset from any authoring facility. Web developers quickly found themselves in high demand.

Microsoft quickly threw itself behind the format, releasing scripting tools, video encoding software and an HD-DVD emulator to test the menuing applications that would go on disc. Toshiba organized a set of authoring goals in 2006 called compilations. These were little software exercises designed to quickly overcome some of the common technical hurdles that nearly every authoring facility would need in its arsenal at some point in time. Craigman Digital and its partners created two of these compilations for Toshiba.

Not unnoticed this time, a competing format, Blu-Ray, loomed over the horizon. Blu-Ray boasted nearly the exact same specifications as HD-DVD with some relatively minor deviations: a different, yet still good, video codec; Java-based authoring as opposed to JavaScript (yes, the two share little more than a syllable); and internet capability. But the big difference was the size.

Each layer on a Blu-Ray disc holds up to 25GB of data, two-thirds more than an HD-DVD layer. This got computer manufacturers and gaming companies lined right up behind BD. They didn't care about superior codecs, they didn't care about audio quality, they cared nothing for internet connectivity. They simply believed in the bigger bucket.

Outside the hardware community, other lines were being drawn, with Sony leading the Blu-Ray charge and Microsoft/Toshiba working feverishly for HD-DVD. Studios split on the issue, some straddling both sides of the fence. Music companies were staying away – far, far away – from investing heavily in either format (though some titles were made in both formats, like Nine Inch Nails’ Beside You In Time).

The format war dragged on through most of 2007, as Sony threw fistfuls of money at Blu-Ray marketing during consumer and professional tradeshows alike. Then right around Christmastime the rumors started. They started in low, and began to grow.

At CES in January 2008, Warner Brothers Studios announced that they were done waffling and would cease to release any more titles on HD-DVD. Citing the film industry's inability to survive another format war, the giant studio shrugged and put out the flame on HD-DVD in a single swift snuff.

While it may be true that the industry could not survive another format war, and it was very likely a smart decision on the studios' part to halve production costs on already costly products, they killed the wrong format. HD-DVD, while slow to load, was actually full-featured out of the box. It used a better video codec (at the time), and its internet capabilities were working from day one. Blu-Ray went through a two-step release program, and has only recently achieved velocity with its networking abilities. HD-DVD, with the support offered by Microsoft and Toshiba, was extremely easy and much cheaper to author.

Time of Death: January 5, 2008

Factors contributing to its demise:

Yet another costly format war.

Hardware manufacturers' preference for Blu-Ray's 25GB layers over HD-DVD's 15GB (it didn't matter: Blu-Ray could only make dual-layer discs at the time, while Toshiba was prototyping a 5-layer disc. Do the math.)

Discontinued support from Warner Bros. Studios yanked the rug out from under its feet.



PCM vs DSD

I was at a Hannover tradeshow, perusing a Philips Electronics brochure curiously titled "SACD – Why do we need it?" The event was the Tonmeistertagung, and the brochure was a message to the consumer explaining why SACD is better than CD or DVD-Audio. While this particular booklet was quite benign in its assertions, one page caught my attention. On it was a depiction of a 10kHz square wave and two other pictures: one a graph of the square wave processed by PCM, and the other processed by DSD. The graphic showed that a 10kHz square wave, when passed through a DSD analog-to-digital-to-analog chain, looks much more like the original square wave than the same signal passed through a PCM A-D-A process. In fact, the PCM signal came out looking like a 10kHz sine wave! Was this right?

I was curious about two things. The output of the PCM converter seemed way too sinusoidal, especially to someone like myself who has been involved with high-resolution digital audio for so long. I knew that at lower sampling rates a 10kHz square wave would look pretty shabby, but the sine wave aroused some suspicions. This particular page was, after all, designed to show that SACD sounds better than DVD-Audio. The other thing that raised my eyebrow was the cleanliness of the DSD graph. It looked nearly identical to the square wave – not nearly shabby enough, I thought.

When I got back home, I thought I'd try out the comparison first hand, just to check Philips' homework. The following illustrations are well-known to some, and the ensuing analysis has been discussed at great length by those more entwined in digital audio processing than myself. I simply found the assertions in the brochure to be quite bold, and worth looking over. I used an analog 10kHz square wave generator, a dCS A/D converter, a dCS D/A converter, a good old-fashioned analog oscilloscope, and a digital camera. I ran the signal directly to the scope (Fig. 1), then through the converters using 44.1kHz PCM, 96kHz PCM, 192kHz PCM, and finally, DSD.

Figure 1 10kHz square wave, undigitized

Sure enough, the first PCM pass output a sine wave (Fig. 2). This is not too surprising, considering that at 10kHz, a CD takes only 4.4 samples per cycle. Filtering and interpolation smooth out what would otherwise be a rather jagged signal. Note the smoothness of the trace on the scope. But since the pamphlet mentioned DVD-Audio, I proceeded with higher sampling rates.

Figure 2 10kHz square wave sampled at 44.1kHz PCM
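The arithmetic behind that "4.4 samples per cycle" figure, extended to the higher PCM rates tested below:

```python
TONE = 10_000  # Hz, the square wave under test

# Samples available per cycle of a 10 kHz tone at each PCM rate
for fs in (44_100, 96_000, 192_000):
    print(f"{fs:>7} Hz PCM: {fs / TONE:.2f} samples per cycle")
```

Fewer than five points per cycle leaves only the fundamental after the reconstruction filter, which is exactly why the 44.1kHz trace looks sinusoidal.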

The 96kHz trace (Fig. 3) improved in two ways: the rise time of the tone shortened greatly, and the peak now sustained for a visible duration. This was beginning to look like a square wave. In fact, this picture begins to reveal more about square waves than most people care to know. Square waves are produced by adding a sine wave to its odd harmonics (f + 3f + 5f + …, each scaled down in amplitude). That downward bump at the peak's extremities is the first of those harmonics coming out of the signal.

Figure 3 10kHz square wave sampled at 96kHz PCM
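That odd-harmonic recipe can be written out directly from the Fourier series of an ideal square wave:

```python
import math

def square_partial(t, f, n_terms):
    """Partial Fourier series of a unit square wave at frequency f:
    (4/pi) * sum over odd n of sin(2*pi*n*f*t) / n."""
    total = 0.0
    for k in range(n_terms):
        n = 2 * k + 1                       # odd harmonics: 1, 3, 5, ...
        total += math.sin(2 * math.pi * n * f * t) / n
    return 4 / math.pi * total

# With one term the "square" is a pure sine; each added odd harmonic
# squares up the shoulders, just as the scope traces show.
peak = square_partial(0.25, 1.0, 1000)      # sampled at the crest of a 1 Hz wave
```

With enough harmonics the partial sum sits within a fraction of a percent of the ideal ±1 plateau; truncate the series (as a band-limited converter must) and the ripple at the peak's extremities reappears.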

At 192kHz (Fig. 4), the trace improves even more, halving the rise time again and revealing the third harmonic of the square wave. The signal is still very crisp and consistent, and the difference in rise time between it and DSD is statistically (and perhaps audibly) insignificant.

Figure 4 10kHz square wave sampled at 192kHz PCM

The DSD trace (Fig. 5) represents the square wave better than the first two PCM traces, and it does look very similar to the picture in the Philips brochure (except that it's blurry). Like the 192kHz photo, the DSD signal begins to extract the first and third harmonics from the square wave. But DSD's accuracy comes with a cost, which is not discussed by proponents of SACD: loads of noise.

Figure 5 10kHz square wave sampled with DSD

The noise created by Direct Stream Digital is tremendous – so tremendous, in fact, that Sony/Philips created a noise-shaping system designed solely to disguise the inherent noise in a DSD signal. Explained briefly, the noise created by DSD's one-bit sampling is shifted out of the lower frequencies and shoved up into the ultrasonic range, thereby making the noise "inaudible." We can see that the system is not quite perfect, as the 10kHz signal is still tainted by noise. But this noise is not the only fly in the ointment.
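A toy first-order delta-sigma modulator shows both halves of the story: a 1-bit stream whose local average tracks the input, and quantization error that gets recirculated rather than eliminated (this is a sketch of the general principle only, not Sony/Philips' actual multi-order modulator):

```python
def delta_sigma_1bit(signal):
    """First-order delta-sigma modulation: integrate the error between
    the input and the previous 1-bit output, then quantize the sign."""
    out, acc, fb = [], 0.0, 0.0
    for x in signal:
        acc += x - fb                 # accumulate the quantization error
        fb = 1.0 if acc >= 0 else -1.0
        out.append(fb)
    return out

# A constant input of 0.5 yields a stream of +/-1 bits whose
# running average (a crude low-pass filter) recovers ~0.5.
bits = delta_sigma_1bit([0.5] * 1000)
avg = sum(bits) / len(bits)
```

The error feedback pushes the quantization noise toward high frequencies, where a low-pass filter can average it away; the noise is moved, not removed, which is the trade-off discussed above.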

The blurring in Figure 5 is caused by imprecise traces along the vertical and horizontal axes, which are much more significant than the noise superimposed on the traces themselves. These imperfections in the DSD signal are, respectively, amplitude imperfections and time domain imperfections. Were one to zoom in on the DSD signal, one would actually see amplitude fluctuations of 50% peak amplitude, and time domain errors similar to the 96kHz rise time deviation. The defect, when compared with the PCM photos, illustrates perfectly the reason that DSD is incapable of reproducing the same transient twice. Note again the precision with which PCM represents the signal. If DSD cannot identically represent a simple square wave over a very short period of time (as compared to the PCM models), the time domain errors caused by DSD sampling are too great to precisely and accurately (remember those terms from first year physics?) reproduce a sound.

What conclusions can be drawn from this photo gallery? Well, one must certainly point out that a 10kHz square wave does not make for a very memorable listening experience. However, it does help dispel the myth that DSD’s one-bit sampling is the panacea to the world of digital audio. We can clearly see that with this particular waveform, PCM produces a much more faithful copy of the original with both accuracy and precision.

Now, at the upper end of the audible spectrum, we toy with the age-old (and sometimes annoying) digital question: can humans really perceive sound above 22kHz? I am of the opinion that ultrasonic harmonics make a difference. Otherwise, I'd be working at 24-bit 44.1kHz, and not bothering to open this discussion. The significance of the preceding graphs is certainly in the ultrasonic. Remember the noise-shaping issue with DSD? Well, the noise in a DSD signal increases dramatically as the frequency increases. In fact, DSD's noise level can be as high as –40dB in the ultrasonic range¹. 24-bit PCM has a consistent noise level of –144dB across all frequencies. This means that DSD's ultrasonic characteristics become increasingly tainted with noise as the frequency goes up. How can warmth and harmonics be reproduced in such a maelstrom? It's like listening to an ultrasonic cassette. This is, by the way, why most SACD players are made with a built-in 50kHz rolloff filter. Though Sony and Philips tout a 100kHz frequency range, the spectrum must be halved at the player's output. Since such a barrage of ultrasonic energy tends to fry tweeters, the rolloff is necessary to protect equipment incapable of handling this sonic assault. So much for the efficiency of one-bit systems.
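The PCM noise floors quoted above follow from the standard quantization-SNR rule of thumb (the –144dB figure in the text uses the rougher 6 dB-per-bit approximation):

```python
def pcm_dynamic_range_db(bits):
    """Theoretical SNR of an N-bit PCM full-scale sine: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit PCM: about {pcm_dynamic_range_db(bits):.0f} dB of dynamic range")
```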

My most significant conclusion points simply to more questions: How does an engineer work with DSD? Do you roll the room off at 50kHz or leave it at 100k? Where is the peace of mind that your work will be accurately reproduced once it leaves your studio? SACD – Why do we need it?

¹ Ingvar Öhman, interviewed by Niklas Ladberg, Elliott Sound Products