The Internet is a useful tool for documenting the past and present. It “accurately” records everything we post and sometimes organizes it for easy retrieval. For the most part, we love this ability to catalogue the past for quick and easy reference. That’s why apps like TimeHop and features like Facebook’s “On This Day” have become so popular. They keep track of my past experiences so I don’t have to. I use these features every day. Sometimes I’ll stumble across an interesting picture or status update I had forgotten about, one that makes me smile or brings back more of a past memory. Most of the time, I cringe because I was a really stupid teenager (but weren’t we all?). I do think it’s extremely interesting what kinds of memories many of us choose to share with our friends and the Internet. Many of them do tend to be embarrassing ones (I know I’ve shared quite a few), but that’s a different, although related, topic.
Social media has especially allowed us to keep a more detailed record of the past as a form of memory, even if we exclude services like Facebook’s “On This Day.” Anyone with the right privacy permissions can scroll back through a person’s social media timeline and access previous posts, pictures, videos, links, and other shared content. Commenting on or liking these (particularly on Facebook) creates a notification for potentially several people, bringing that memory back to the forefront as they click the notification and look at the new activity. These posts even appear on other people’s news feeds, so those who were not a part of the memory originally now have access to it and can choose to become a part of it, even if that participation is more reserved.
Technology, from social media to documents on a computer, allows users to create a “perfect” representation of a memory, in the sense that they can capture an unbiased, if superficial, record of events through text, audio, and visual means of recording. These recordings preserve exactly what you wrote, what you said, and how you acted or looked, but they cannot capture felt emotions, the feelings of the participants, smells (yet), feels (as in touch sensations, not the things fangirls get when something emotional happens to their favorite characters or shows, though I suppose technology cannot truly capture those feels either), and all other sorts of sensations. They capture only the surface level. They are also subject to tampering. Because everything digital is essentially made up of a bunch of zeros and ones, someone with the know-how could change that information, even if they had to hack into a server. Photoshop exists and has been used many times to purposely deceive. Even deleting photos or posts changes the memory, because it pretends that moment did not exist. In these instances, the original “memory” has been altered into something false.
Human remembering, on the other hand, has its own positives and negatives. Through our own memories, we can connect to the emotions we felt during the event or moment. In cases of extreme emotion, such as trauma, thinking back to the memory can trigger those emotions all over again. These memories often feel much more real than technology’s portrayal of them. Of course, human memories can also be tampered with. Take, for example, when (HARRY POTTER AND THE HALF-BLOOD PRINCE spoiler) Slughorn changes his own memory so no one knows he was the one who told young Tom Riddle about Horcruxes and how they work.
But more seriously, as time goes by and the memory becomes more and more distant, we can start to forget small or even large details about what happened, even if we originally remembered it “correctly.” I put correctly in quotation marks because two people could have different interpretations and memories of the same moment, and both could be equally valid. Memories are made up of personal experiences just as much as they are of what technology could report. For certain traumatic events, these memories might even be blocked subconsciously for the overall mental health of the individual. In these cases, the memory has been tampered with, although not as purposefully as when it is changed within technology.
Even our present and future experiences change how we view our memories, and in turn change the memories themselves. We see this a lot when adults try to remember their own childhoods. If they haven’t forgotten specific memories completely, they have almost certainly forgotten the emotions and experiences that used to be so strongly attached to those memories. It’s hard for them to remember how they felt at that age, so they have difficulty interacting effectively and compassionately with children and teenagers. Likewise, a bad breakup essentially taints what used to be happy memories from the rest of the relationship. The emotions associated with these memories change drastically based on later experiences.
Even though human recollection might feel more real to the individual, society as a whole has begun to rely most heavily on digital memory, especially for support and evidence. When an argument takes place or a court trial is taking testimony, digital recordings, screenshots of social media posts, and other documents hold much more weight than the memories of any one witness. These memories provided by technology are treated as the be-all and end-all of successful support. Our culture privileges “perfect” recollection over subjective human recollection. As Carter Hanson notes, the society in M.T. Anderson’s novel Feed exemplifies this mindset to the extreme. We value objectivity much more than subjectivity. (That’s also why the STEM subjects of science, technology, engineering, and math are much more valued by our society than the humanities. Of course, we English majors know they’re wrong.) We see portrayals of these kinds of memories in movies. As Hanson writes, “A common conceit of memory flashbacks in Hollywood cinema is that a character’s recollection of a past event plays just like a film, as if the character were able to hit a memory playback button and relive a memory frame by frame” (265). We idolize this type of memory because it supposedly depicts exactly what happened the way it happened. In movie flashbacks, there is very little room for error.
The problem with this emphasis on and preference for sole digital remembrance (rather than just using it to supplement our own memories, as we have done since the invention of writing) is that the act of remembering actually stimulates our brains and thought processes. Hanson argues, “Rather than freeing up additional brain space for more creative work, offloading the task of remembering onto computers inhibits our long-term memory from forming the ‘complex concepts, or “schemas,” . . . [that] give depth and richness to our thinking’” (266). The common argument for not needing to remember information is, “Well, I can just Google it. Why do I need to remember it?” The same argument comes up in school, particularly in math classes: why do we need to learn how to do all these different problems that we will never use? The answer is that, by learning the processes and problems, we improve our logic skills, which sharpens our critical thinking and makes us smarter in other areas of our lives. By purposefully trying to develop our memory and remembering skills, we expand our knowledge.

Again, Hanson argues, “While storing memory in gigantic quantities may alleviate certain anxieties, Nora suggests that the movement of memory ‘from the inside’ to externally archived traces displaces the ‘responsibility of remembering’ onto the archive itself, causing memory to lose its centrality as an individual or social engine” (267). Once offloaded, the memories become the property of all the people who encounter them. They are no longer any one person’s memories, and as such, the person slowly loses that which makes him or her unique. Our memories stem from our experiences. If we lose those memories, we lose how the experiences changed us, and then we lose our individuality.
All because we let TimeHop and Facebook take over our memories.
(Okay, maybe that’s a little extreme.)