Monday, September 25, 2006
How "Digital" Has Destroyed Music
The beginning of it all was when digital sound arrived in movie theaters, with formats like Dolby Digital and Sony's SDDS. It was a worthy prospect. Abandoning analog sound technology certainly would make things easier for the production studios. Imagine carrying around a whole movie score on a portable hard drive. On the composing side, it began with Apple's QuickTime software. For the first time film composers, with the right equipment, could import a film into the digital domain, i.e. as a QuickTime file, and sync it to their music production software. When they played the movie file in a little TV-like window on their Mac, it would sync to their sequencing program.

What is a sequencer? A sequencer is an indispensable piece of computer software that allows musicians to record MIDI data into a tape-recorder-like interface. Basically that means you can play music on a keyboard or drum pads that transmit MIDI data to the computer in real time. You set up a time signature with a click and play the music. It is then recorded digitally as data, not sound. In an editing window you can fix wrong notes, timing or performance errors, and add expression such as dynamics or phrasing. Good composers play the music live. In essence the Macintosh became a glorified recording studio.

As time went on, engineers refined the software. As hard drives sped up and became more efficient, and analog-to-digital converters became less expensive, they created an interface that allowed you to record "live" sound to the hard disk in sync with MIDI data. That way keyboard sounds and digital drum samples could play in real time with the recorded audio. Every time the sequencer played, the sound quality was first generation. Eventually you would mix the composite recording to a mastering deck, such as DAT or analog reel-to-reel. MIDI was the real beginning of digital music.
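The "data, not sound" idea above can be sketched in a few lines. The tick resolution and event layout here are illustrative, not any particular sequencer's file format; 480 ticks per quarter note is simply a common convention:

```python
PPQ = 480  # ticks per quarter note (illustrative resolution)

# Each note is just (start_tick, pitch, velocity) - pure data the
# editing window can change, unlike a recorded audio waveform.
performance = [
    (5, 60, 96),     # played a hair late
    (478, 64, 80),   # a hair early
    (965, 67, 110),  # late again
]

def quantize(events, grid=PPQ):
    """Snap each note's start time to the nearest grid line -
    the 'fix the timing' edit described above."""
    return [(round(start / grid) * grid, pitch, vel)
            for start, pitch, vel in events]

print(quantize(performance))
# -> [(0, 60, 96), (480, 64, 80), (960, 67, 110)]
```

Because only the event data changes, the sound itself stays first generation no matter how many times the notes are corrected and replayed.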
Musical Instrument Digital Interface (MIDI) basically took the notes you played on a keyboard and turned them into digital controller data, so you could also trigger other keyboards or sound modules. Huge layered sounds were now possible when playing live. Live bands picked up on this quickly, and MIDI rigs became crucial to the success of live musical acts. Keyboard magazine documented many of the "rigs" musicians were designing and playing on their gigs. For the first time a production-oriented sound was available live. Keyboard players in bands could conceivably be virtual orchestras playing string samples, horn solos, and bass lines. Down the line the Hip-Hop faction utilized this with the Akai MPC series drum machines. They played the "beats" live on rubber pads, giving the music a "groovy" feel along with a unique sound. The Hip-Hop sound became a low, analog bass sound with a high, "chinky" hi-hat from a synthesized drum kit. In their world "Lo-Fi" became "Hi-Fi." This early gritty sound had character because of its primitive technology. Low sample rates, low bit depths, and grainy filters provided a very analog-like sound that was appealing to many.

As digital technology progressed, sample rates and bit depths became higher, smoothing out the sound. Eventually "virtual instrument plug-ins," software representations of analog instruments, became available for use in sequencers. Imagine again abandoning your home studio full of heavy steel MIDI tone modules for the convenience of instruments that live on your hard drive. Sounds good, huh? The problem I see with all of this convenience is, when your whole studio begins to reside in a box, where will your inspiration come from? All of this digitization has had a lasting effect on the music industry. When one-man studios can hawk their inexpensive production techniques, music becomes secondary.
Hype has taken over, much like on the TV show American Idol, and the media try to sell their cut-rate product with aggressive marketing techniques. Television right now is the epitome of this flawed philosophy. Superficial, shock-effect, immediate-gratification pitching has marred the communications industry. A once well-respected and honorable form of news and entertainment has fallen from grace. Specifically, how has this affected music? I have one answer. The digital domain has become high resolution. HDTV, plasma displays, and other gimmicks have flooded the market. The soul, artistry, integrity, or "analog" of the PRODUCT has been lost. Without this, all the technological advancement in the world will not mean anything.

When sequencers first became available to musicians, they were analog. Oberheim and Moog both offered analog sequencing devices that recorded musical events. The resolution of these devices was low. That meant the "timing" of the music was loose and felt human. As digital sequencers developed, new beat plug-ins became available that quantized MIDI events in various ways to simulate live performance. Engineers got "jiggy" and began marketing different rhythmic plug-ins that affected the music. No longer did a live band have to generate a "groove" or "feel" in the studio to define a particular style of music. These plug-ins attempted to recreate the human feel of musicians in the digital domain. In certain ways these "beats" have been helpful in the evolution of electronic music. Genres such as Trance, House, and Acid Jazz may not have existed at all if it hadn't been for these tools. Taken out of context and put into the hands of a corporation, these shortcuts in music production seem to have eliminated the musicians themselves. Once bands had to travel and record for months to develop their music into a product worthy of consumption. Now music in media has been reduced to a shallow, trite, and disposable product.
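A crude sketch of what those "feel" plug-ins attempt, assuming the simplest possible approach: nudge each quantized note off the grid by a small random offset. Real groove plug-ins are far more sophisticated, and the tick values and drum pitches here are illustrative:

```python
import random

PPQ = 480  # ticks per quarter note (illustrative)

def humanize(events, max_offset=12, seed=None):
    """Push each strictly quantized note slightly off the grid -
    a toy version of a 'humanize' or groove plug-in."""
    rng = random.Random(seed)
    return [(start + rng.randint(-max_offset, max_offset), pitch, vel)
            for start, pitch, vel in events]

# Four quantized drum hits: kick (36) and snare (38) on the grid.
strict = [(0, 36, 100), (480, 38, 90), (960, 36, 100), (1440, 38, 90)]
print(humanize(strict, seed=1))
```

The point of the passage above stands either way: the offsets are synthesized after the fact rather than played by a band generating a groove together.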
One bad example of the utilization of this technology for corporate gain is in the cruise ship industry. Showbands on various cruise ships are supposed to play live music for their production shows. Because musicians of this caliber are not readily available, cruise lines require the arranger of the show to provide a recorded soundtrack that acts as a security blanket for the ship. If and when the hired Showband cannot play the show, the tracks can be played instead. On most ships they end up running these tracks all the time, because they include instruments that are not available in the live orchestra. A string section, French horns, backing vocals, and ethnic instruments are often added to the mix of a live band.

Unfortunately this soundtrack is generated by a computer sequencer like the one I have been describing. When a composer does not have the time or resources to hire live players, they often generate a MIDI sequence from their notation program. They will arrange the music in a program such as Finale or Sibelius and export a MIDI file to Digital Performer or Logic. The result is a Casio-drum-machine file that sounds exactly like what it is: a machine. There is no human quality or expression in the performance of the music. Over time and out of necessity, arrangers have continued to do this, forcing their live musicians to create a new "style" of music to play along with the track. This strictly quantized feel has become the norm for cruise ship Showbands. For some reason it seems many corporate executives took a fancy to this kind of music. Gloria Estefan used a sequencer like this in her song "Conga," and it had an appealing result. Whereas certain styles of commercial music can make use of this quantized feel, American jazz for one cannot. Many styles of music, including European orchestral and chamber music, do not use strictly quantized quarter notes.
Early in academic music school, fledgling musicians are taught to play slightly ahead of the pulse to give music a forward momentum. This positive and energetic feel is what creates true excitement. In other styles such as Rhythm and Blues, musicians are taught to lay slightly behind the beat to create musical tension. It is very simple to define a transcendent feel in music. Rarely in any kind of performed music do all the players play in strict metronomic rhythm. There is give and take in phrases, and this is what lets the music breathe. The best conductors are experts at interpreting like this. They spend countless hours rehearsing groups to achieve this kind of musicality. Nowhere is it written, "All musicians should play exactly together like a machine." For cruise lines to accept this poor level of product from their arrangers is heinous. The neurosis created by hiring competent musicians and then asking them to play like a machine is absurd. Only if the respective Musical Director is keen enough to remember the difference between "real" music performance and ship music will the band be able to transcend those musical barriers.

The definition of a "pocket" in musical feel is when a drummer or piano player can play slightly ahead of the beat with one extremity and slightly behind with the other. Erroll Garner did this first on the piano on his recording Concert by the Sea. In the jazz idiom a trap drummer will play ahead of the beat on the ride cymbal while playing behind the beat with his hi-hat. This creates a "pocket," or space for the music to breathe. In jazz the music swings over the top of slightly pushed-ahead bass notes using a triplet subdivision. Only on ships are musicians brainwashed to think the dotted-eighth/sixteenth rhythm will substitute. Rather, it is the ebullient movement of swung eighth notes undulating around this forward-pushing beat that provides the traditional jazz feel most aficionados appreciate.
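The difference between the triplet swing feel and the dotted-eighth/sixteenth substitute can be put in numbers. In a sketch at 480 ticks per quarter note (an illustrative resolution), a triplet off-beat lands two-thirds of the way through the beat, while the dotted-eighth feel pushes it to three-quarters:

```python
def swung_eighths(bars, ratio, ppq=480, beats_per_bar=4):
    """Tick positions of swung eighth notes: each beat gets a downbeat
    plus an off-beat placed `ratio` of the way through the beat."""
    ticks = []
    for beat in range(bars * beats_per_bar):
        start = beat * ppq
        ticks.append(start)                     # on-beat eighth
        ticks.append(start + round(ppq * ratio))  # swung off-beat
    return ticks

jazz = swung_eighths(1, 2/3)   # triplet swing: off-beats at 320, 800, ...
stiff = swung_eighths(1, 3/4)  # dotted-eighth feel: off-beats at 360, 840, ...
print(jazz[:4], stiff[:4])
# -> [0, 320, 480, 800] [0, 360, 480, 840]
```

Forty ticks per beat does not look like much on paper, but it is exactly the gap between a swinging ride pattern and the stiff, machine-like substitute described above; and in a real performance that ratio floats with tempo and phrasing rather than staying fixed.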
To subdivide every style of music into sixteenth notes or finer defeats the purpose of the music itself. Only with the development of the software digital sequencer has this high resolution confused and tarnished what was once a human aesthetic.