The Internet is a useful tool for documenting the past and present. It “accurately” records everything we post and sometimes organizes it for easy retrieval. For the most part, we love this ability to catalogue the past for quick and easy reference. That’s why apps like TimeHop and features like Facebook’s “On This Day” have become so popular. They keep track of my past experiences so I don’t have to. I use these features every day. Sometimes I’ll stumble across an interesting picture or status update that I had forgotten about, and it makes me smile or helps me recall more of a past memory. Most of the time, though, I cringe because I was a really stupid teenager (but weren’t we all?). I do think it’s extremely interesting what kinds of memories many of us choose to share with our friends and the Internet. Many of them do tend to be embarrassing ones (I know I’ve shared quite a few), but that’s a different, although related, topic.
Social media in particular has allowed us to keep a more detailed record of the past as a form of memory, even if we exclude services like Facebook’s “On This Day.” Anyone allowed by a person’s privacy settings can scroll back through that person’s social media timeline and access previous posts, pictures, videos, links, and other shared content. Commenting on or liking these (particularly on Facebook) creates a notification for potentially several people, bringing that memory back to the forefront as they click the notification and look at the new activity. These posts even appear on other people’s news feeds, so those who were not a part of the memory originally now have access to it and can choose to become a part of it, even if that participation is more reserved.
Technology, from social media to documents on a computer, allows users to create a “perfect” representation of a memory, in the sense that they can recapture an unbiased, if superficial, record of the events through text, audio, and visual means of recording. These recordings capture exactly what you wrote, what you said, how you acted or looked, but they cannot capture the felt emotions of the participants, smells (yet), feels (as in touch sensations, not the things that fangirls get when something emotional happens to their favorite characters or on their favorite shows, although I suppose technology cannot truly capture those feels either), and all other sorts of sensations. Technology captures only the surface level. It is also subject to tampering. Because everything digital is essentially made up of zeros and ones, someone with the know-how could change that information, even if they had to hack into a server to do it. Photoshop exists and has been used many times to purposely deceive. Even deleting photos or posts changes the memory because it pretends that moment never existed. In these instances, the original “memory” has been altered into something false.
Alternatively, human remembering has its own positives and negatives. Through our own memories, we can connect to the emotions we felt during the event or moment. In cases of extreme emotion, such as trauma, thinking back to the memory can trigger those emotions all over again. These memories often feel much more real than technology’s portrayal of the moment. Of course, human memories can also be tampered with. Like when (HARRY POTTER AND THE HALF-BLOOD PRINCE SPOILER) Slughorn changes his memory so no one knows he was the one who told young Tom Riddle about Horcruxes and how they work.
But more seriously, as time goes by and the memory becomes more and more distant, we can start to forget small or even large details about what happened, even provided we originally remembered it “correctly.” I put correctly in quotation marks because two people could have different interpretations and memories of the same moment, and both could be equally valid. Memories are made up of personal experiences just as much as they are of what technology could report. For certain traumatic events, these memories might even be blocked subconsciously for the overall mental health of the individual. In these cases, the memory has been tampered with, although not as purposefully as when a digital record is altered.
Even our present and future experiences change how we view our memories, and so in turn change the memories themselves. This happens a lot when adults try to remember their own childhoods. If they haven’t forgotten specific memories completely, they have often forgotten the emotions and experiences that used to be so strongly attached to those memories. It’s hard for them to remember how they felt at that age, and so they have difficulty interacting effectively and compassionately with children and teenagers. Similarly, a bad breakup essentially taints what used to be happy memories from the rest of the relationship. The emotions associated with these memories change drastically based on later experiences.
Even though human recollection might feel more real to the individual, society as a whole has begun to rely most heavily on digital memory, especially for support and evidence. When an argument takes place or a court trial takes testimony, digital recordings, screenshots of social media posts, and other documents hold much more weight than the memories of any one witness. These memories provided by technology are treated as the be-all and end-all when it comes to successful support. Our culture privileges “perfect” recollection over subjective human recollection. As Carter Hanson notes, the society in M.T. Anderson’s novel Feed exemplifies this mindset taken to the extreme. We value objectivity much more than subjectivity. (That’s also why the STEM subjects of science, technology, engineering, and math are much more valued by our society than the humanities. Of course, we English majors know they’re wrong.) We see portrayals of these kinds of memories in movies. As Hanson writes, “A common conceit of memory flashbacks in Hollywood cinema is that a character’s recollection of a past event plays just like a film, as if the character were able to hit a memory playback button and relive a memory frame by frame” (265). We idolize this type of memory because it supposedly depicts exactly what happened the way it happened. In movie flashbacks, there is very little room for error.
The problem with this emphasis on and preference for solely digital remembrance (rather than just using it to supplement our own memories, as we have done since the invention of writing) is that the act of remembering actually stimulates our brain and thought processes. Hanson argues, “Rather than freeing up additional brain space for more creative work, offloading the task of remembering onto computers inhibits our long-term memory from forming the ‘complex concepts, or “schemas” . . . [that] give depth and richness to our thinking’” (266). The common argument for not needing to remember information is, “Well, I can just Google it. Why do I need to remember it?” The same argument happens in school, particularly in math classes. Why do we need to learn how to do all these different problems that we will never use? The answer is that, by learning the processes and problems, we improve our logic skills, which improves our critical thinking skills and makes us much smarter in other areas of our lives. By purposefully trying to develop our memory and remembering skills, we expand our knowledge. Again, Hanson argues, “While storing memory in gigantic quantities may alleviate certain anxieties, Nora suggest[s] that the movement of memory ‘from the inside’ to externally archived traces displaces the ‘responsibility of remembering’ onto the archive itself, causing memory to lose its centrality as an individual or social engine” (267). The memories become the property of all the people who encounter them. They are no longer any one person’s memory, and as such, the person slowly loses that which makes him or her unique. Our memories stem from our experiences. If we lose those memories, we lose how the experiences changed us, and then we lose our individuality.
All because we let TimeHop and Facebook take over our
memories.
(Okay, maybe that’s a little extreme.)
Kristie, you make a great point in bringing up the advent of features like TimeHop and the “This Day X Years Ago” tools that social media websites like Facebook have come up with. Although Facebook keeps your timeline available to check, most people don’t have a habit of dredging back through five years of their timeline to see what they posted. These features have made it so that people don’t have to actively remember what happened, since it’s available for them simply by using these tools or manually digging back through their timeline.
I also find the point about digital remembrance concerning (as you note from Hanson), especially how it is displayed in film and television. As you mentioned, it is often shown as a sort of “mini movie” within the movie, with characters somehow managing perfect recall of moments that fit perfectly with the situation. One film that I feel portrays flashbacks well is The Usual Suspects (Singer, 1995). The film is told from the perspective of Roger “Verbal” Kint (Kevin Spacey) to detectives as a series of flashbacks and pseudo-memories, and by the end of the movie, the viewer becomes unsure of the validity of the events that occurred throughout (it takes “unreliable narration” to a whole new level). The film Big Fish (Tim Burton, 2003) also does well with the idea of flashbacks as a form of memory. The main character, Will, has been told stories all his life by his father, who tells him of the amazing adventures he’s taken, crazy people he’s met, and fantastic places he’s been. (SPOILER ALERT) Remembering all these fantastic details, Will sets out to learn the truth about his father’s stories, but finds them to be just that: stories. All the amazing details that he remembered were just the workings of a storytelling father to an imaginative little boy.
I agree with your section quite a bit. Facebook, Instagram, Twitter, and all other social media, while helpful and creative for many things, have become a crutch for people to post every little aspect and detail of their lives. Rather than living in the moment and remembering what they have seen or done, people now use technology to record, photograph, and document their entire lives. Why try to remember the things they’ve done or places they’ve been when they can just look at all the pictures they have on Instagram or the thoughts they’ve shared on Facebook during the experiences?
I personally find these features, although I railed against them as a way to experience time on McKenzie’s blog post, to be somewhat helpful, at least when it comes to writing. I’m still against using these devices to aid memory, and odds are slim you’ll catch me taking photos and videos when something exciting is happening, but when it comes to writing non-fiction, these things do indeed help. I’ve found quite often, at least when writing about my recent past, that I can find photos on social media sites, along with some video, that help me remember details I may have forgotten. The farther back I go, though, to a time when these things weren’t used, the more I struggle to find the material I need. In one respect, social media does help organize whatever record you have of the past. Usually when I try to find a photo from before these sites existed, I have to take the long trip back to my parents’ home. From there, it breaks down to going through photo albums, home videos, and more to find just one thing I’m looking for. Don’t get me wrong, I love seeing all these things from my past, but it takes a lot longer than the time I want to spend on it.
Maybe this is the dilemma that technology is really presenting to people. It’s not that things can be distorted, or that technology takes people out of the moment, but rather that it ruins taking the time to cherish thought and memory. Why waste time searching through old books and videos when Google or Facebook can find something for you in half the time?
I've written a lot about YA dystopia and its particular popularity in the 21st century. The ways in which these texts are both political and watered down is interesting to note, as is the fact that they're targeted at a population people like to bemoan as apolitical or apathetic. Like all my scholarship, this is available to read online and is up on The Keep & Academia.edu: http://works.bepress.com/melissa_ames/1/