A long time ago, I began to take a serious interest in music; that is, listening to it, singing along with it, and indulging in fantasies about being a musician. I was about 8 years old when Nickelodeon started running old episodes of The Monkees (a sitcom starring a prefabricated band as fictionalized versions of themselves, chronicling their misadventures), and I just loved it. I didn’t own any Monkees records, had no means of going and purchasing any, and it never really occurred to me that one could buy a Monkees record; I think I assumed they simply didn’t exist for some reason. I mean, I was 8.
So to satisfy this need, I took what was already at the time a gravely outdated cassette recorder, and when The Monkees aired on cable, I held the device up to the TV and hit record when a song played on the show. I did this over the course of a few episodes and soon had a decent collection of Monkees songs on my crappy little tape deck with a single mono speaker—not that this mattered, because of course I was recording from the TV’s speaker. I listened to it all the time.
When my dad realized I had been doing this, he said, “You know, we can buy you a Monkees tape.” That was really nice of him, but in my 8-year-old brain, it was kind of silly. I already had all the songs, dad!
I did eventually acquire The Monkees’ Greatest Hits on audio cassette, and I was very glad to have it. The songs were way longer on the album than on the TV show! The theme song even had a second verse! And there were a bunch of songs that weren’t even on the show! And wow, “Listen to the Band,” that one was weird!
While I also noticed that I could hear more of the instrumentation on the album than on my tape of TV audio, it wasn’t all that much different to me. I mean, it was still the same crummy old tape deck with the mono speaker. It didn’t seem crummy at all to me, as I had little to compare it to. So it was fine! When I eventually got other tapes—“Weird Al” Yankovic, the soundtracks to Ghostbusters and Transformers: The Movie, and so on—it was the same. I was just happy to have them.
I wasn’t done recording music straight from the TV, either. Over the next couple of years I would also start recording music from my video games (good lord did I love the music from Mega Man II), other kinds of music I was growing to like from MTV and VH1, and even sketches from Nick at Nite’s reruns of SCTV. These were the crappiest possible ways to get this kind of audio, but I didn’t know, and therefore didn’t care. I could now listen to those SCTV sketches over and over, memorizing them without intending to, and reenacting them on my own recordings. And I could listen to my Phil Collins and Genesis songs all I wanted, pulled right from the tinny sound in my room coming out of my tiny TV set.
At some point, I was given a portable cassette player (what I would call a Walkman, but of course wasn’t). I’d still listen to the same stuff, poorly recorded by me, totally unaware that “audio production” was a thing that mattered a lot to people who made and listened to music.
And yes, my family thought it was very weird. Especially the TV-recorded video game music. My grandfather couldn’t even fathom it. My dad summarily dismissed the music itself as “probably written by a computer.” Whatever, dad! You’ll never understand me!
As my music tastes matured, I started digging through my dad’s collection of tapes, most of which were themselves copies from vinyl LPs. Dad, himself a savant of a musician, had an extensive (and today, one would definitely say quality) record collection, and I began to avail myself of the stuff I thought I might be into. I already knew I liked the Beatles—every car ride with few exceptions was an extended Beatles session—but I had never really listened to them on my own. They were a good place to start.
At some point I had also gotten a better tape deck, one that actually had two speakers for stereo. I had no idea what difference that made because, again, most of what I was listening to was recorded from the TV speaker and into a tape deck’s built-in microphone. Also, I think around this time the “Walkman” had probably stopped working. One evening, I decided to listen to a Beatles tape—a real “dub” from a real record—through a set of cheap earbud headphones plugged into the tape deck. I don’t remember which album, but I would guess it was one that had some collection of songs I already knew; “Eight Days a Week” rings a bell, so it may have been Beatles for Sale, but so do “Drive My Car” and “Nowhere Man,” so maybe it was Rubber Soul. But it could also just have been a mix.
I was probably 10 years old, and I still remember the sensation of hearing different instruments and different voices seeming to come from different locations around my ears. The boys sounded like they were singing into one ear, and the drums and guitar sounded like they were coming into the other ear! It wasn’t all just a mush of sounds. It was like a tapestry. Or a stage play, where you see every actor, every prop, every set piece. It was utterly transporting.
(Apparently, I was also singing along and doing a poor job of harmonizing, as my little brother would later complain, “You sounded terrible.”)
I was certainly going to raid my dad’s collection for more music. And I wasn’t going back to sounds recorded from a TV speaker.
This is not the origin story of an audiophile. But by my late twenties and early thirties, when I had a real job and enjoyed a living wage, I did begin to care much more about maximizing the quality of the means by which my ears received the air vibrations of music. I had several agonizing crises over what headphones to buy, which only became more fraught when I’d go into a store and sample headphones well outside my price range. Holy shit, I didn’t know music could sound this good!!! Every uptick in audio quality I experienced spoiled whatever I had previously enjoyed. Hell, just knowing that there were better things out there made it difficult to be happy with whatever I had.
This comes up in countless other ways. When Apple introduced “Retina displays” with the iPhone 4, screens with sufficiently high resolution that individual pixels became indistinguishable in normal use, it ruined me for all other displays. Reading text on a non-Retina screen, something I had been doing for some thirty years without complaint or problem, now felt like having my eyeballs scraped by jagged text.
At least that came from actual lived experience, wherein a demonstrably superior execution of a particular kind of product makes previous iterations look worse, because they are, in fact, worse. But I also bought the iPhone 12 when it came out while I already owned an iPhone 11. Why? Did I have direct experience with the new model, and thereby know that my current phone was now teetering on obsolescence? Of course not. It was just new, it looked a little cooler, and the marketing all said it was better. You will not be surprised to know that a few months into owning the iPhone 12, it has not provided me a meaningfully better smartphone experience than the iPhone 11. Nor whatever phone I owned before that. But just being aware of new things, comparing what I have to what might be, often convinces me that I must acquire them. I had been fine before. Being cursed with new knowledge, however, robbed me of that contentment. Or, more accurately, I let myself be robbed. I invited the burglar in and offered him a sandwich.
I recently bought a new laptop. Or, rather, I should say I recently bought three laptops, because the first two, since returned, had flaws on their displays. Since this was intended for gaming, I had made peace with needing to settle for an eye-scratching, non-Retina display, something years of Macs, iPhones, iPads, and Galaxies had ruined me for.
But the first laptop had what seemed to me to be a lot of what is called IPS glow, the light haze around the borders of the screen. This is apparently fairly normal for these kinds of machines, but I had now been conditioned by Retina displays and quad-HD OLEDs, so it seemed unacceptable. The next laptop had a slight blemish of light bleed in the center of the display (I think that one was a reasonable rejection). The last one, the one I’m using now to type this, was just fine. More expensive, of course. It does have a tiny bit of light bleed at the very bottom corner, but I am trying to be okay with it.
This will seem like a divergence, but it’s not. I have spent a lot less time on social media in the last few weeks. Like many folks, I’ve pulled back in part because the world seems less on fire than it did a few weeks ago, and doomscrolling the news feels less necessary. It never was necessary, of course.
Apart from lowering my overall anxiety levels about the state of civilization (which, for the record, remain high), social media distancing has also made me feel less compelled to produce. Longtime readers of mine (all six of you) will know that I am gripped by a feeling of obligation to make myself relevant through my creative work, be it my writing, music, or other endeavors. I often feel that in order to justify my existence, in order to atone for all the things I never accomplished when I was younger, and in order to feel that I have somehow mattered, I must Become Known and cross some threshold of relevance and significance.
In recent weeks, though, I have felt this pang much less sharply than usual. It could merely be the fact that I feel very wrapped up in my gaming, which is a new thing for me. It could also have something to do with the changing political reality.
But I think that it’s mainly because I’m looking at Twitter way less.
Think back to the beginning of the pandemic. Suddenly everyone (it seemed) was part of this renaissance-of-the-remote, people writing novels and cooking and painting and performing and streaming and TikTokking, and I’d be damned if I was going to be left in the creative dust. I started a newsletter. I revived my blog. I recorded some music. I made videos. I was going to matter.
I felt like I had to matter. It was a kind of race, one I had already been losing, when suddenly a lot more runners appeared on the track. Hey! Slow down up there! I was trying to matter first!
I was comparing myself and my output to what I saw from social media. But of course, what I see on social media will always tend to be stuff from those who have already broken through. Of course I was seeing things from people who were already in a position to create. Of course I would see things from people who already had the freedom to churn out content.
Yet I compared. I compared and found myself lacking. And as Dogberry says in Much Ado About Nothing, “Comparisons are odorous.”
When I was listening to the Monkees on my old mono tape deck from sounds recorded from a cable TV rerun, I had no idea what I was missing. Comparing my tapes to the stereo sounds of a professionally produced album out of actual headphones, I learned there was a deeper, more meaningful experience of music to be had. It was good that I compared one to the other. But I was also happy before.
Just because something is better (in fact or in perception) doesn’t make it necessary. Nor does it even necessarily make it a good. And it definitely doesn’t make it necessarily good for me. Or you, I bet.
If you find this newsletter or anything else I produce useful, perhaps you’d consider tossing some currency my way.