How Skimm’ing Is Changing the Way We Get the News

As an International Relations and Economics double major, I pride myself on knowing what’s going on in the world outside the Tufts bubble. As a college student, however, it’s easy to get bogged down in the college lifestyle and feel out of touch.

So, when I heard about theSkimm, I was interested. theSkimm is a free daily newsletter that provides a concise summary of current events and news stories. It currently has over half a million subscribers, and its subscriber base is growing rapidly. Snappy headlines with pop culture references such as “bye Felicia” and “pick me, choose me, love me” attract a key demographic: busy, educated millennials in the workforce.

I heard about theSkimm from a friend who happened to be interning at their Boston office over the summer. I liked the idea of a free daily newsletter summarizing that day’s news stories, and decided to subscribe. Before I knew it, I was checking theSkimm every morning. I, like many other subscribers, was drawn in by its readability and its apparent lack of media bias.

That being said, I’m not 100% ready to call myself a Skimm’er. The world’s transition from print to digital journalism brought a major change in the way people receive the news. Although it is still very much in its early stages, theSkimm raises questions about the future of digital journalism. With this convenient resource, will people still look to news sites for their daily fix of current events? Will people be less likely to delve into certain world issues, leading to a less informed voting base? Or will a consolidation of the news and an increase in accessibility lead to a more informed public?

Shivani Shendye, A17

The Faults of Facebook

Only a few short years ago, Facebook reigned supreme, providing the masses with not only an outlet to communicate, but also a means of checking what Suzy wore to the prom in 2008…in case anyone forgot. While it may be that Facebook, as a domain for communication and information sharing, is still thriving, the site has taken on a number of more negative “statuses” in recent years.

Perhaps the harshest of these new reputations is that Facebook is becoming uncool. That’s right, Facebook—the multi-billion-dollar social media giant with 802 million daily active users—is uncool. Although the losses haven’t seemed to affect Facebook in the slightest, a study conducted by Princeton University researchers John Cannarella and Joshua Spechler controversially concluded that Facebook is on a downward trend in popularity. Additionally, analyst reports, as well as Facebook’s own CFO, have stated that more than 11 million young users have left the site since 2011.

Many Facebook users, in the midst of finals week or looking to get a better handle on their time management, will self-impose a Facebook cleanse, temporarily deactivating their accounts until they miss that familiar shade of blue. Aside from this practice, though, what is causing young users to deem Facebook unimportant and leave the site has recently been called into question.

One theory for this decline is that Facebook is straying from its origins as a tool for communication and becoming a host for unwanted parasites such as spam advertisements, or worse, targeted advertisements. Such advertisements record our online activity—what we like on Facebook and even which websites we view off of Facebook—and then use that information to serve us ads that may suit our preferences. So, for example, if you search for a hotel in New York City, you may see ads for hotels in New York City on your Facebook page. While this may be beneficial for some, others see it as an eerie invasion of privacy and a possible deterrent to using the site, which already provides more than its fair share of spam by way of that friend who is constantly posting updates about what he ate for breakfast.

Another possibility as to why teens are leaving the site is that Facebook makes us sad, as brought to light by a study conducted at the University of Michigan. The reasoning behind feeling the Facebook blues is that it incites social comparison—if you weren’t present at the event that everyone is posting about, unhappiness may set in when you click through the posts and pictures. Since the Internet first rose to popularity, researchers have backed the theory that our increased online presence and decreased personal interactions have heightened our susceptibility to both depression and loneliness.

While what may be triggering teens to leave Facebook is debatable, what is certain is that Facebook usage isn’t always beneficial. What comes with the perks of using the site is a responsibility to be aware of Facebook’s faults. Even though seeing Suzy’s prom dress from 2008 may bring back fond memories of high school, too much Facebook-ing could come at a price.

— Lauren Witte, A14

The Academy Awards Diversity Gap

I’ll be the first to admit that, despite all its hype and elitism, I love the glitz and glam of the Oscars. I am completely enamored with the red carpet coverage, the cheeky hosting, and the emotional acceptance speeches. But it isn’t hard to see that the people accepting those awards have been, well, monochromatic. I’m talking white and, for the majority, male.

Just in time for the Awards this Sunday, Lee & Low Books, an independent children’s book publisher specializing in diversity, released this infographic about the lack of diversity in Academy Award winners since the inception of the Awards 85 years ago. Whether it’s what you’d expect or not, the results are devastating. Only one woman of color has ever won the award for Best Actress, and that was Halle Berry in 2002. Only seven men of color have ever won for Best Actor, the most recent being Forest Whitaker eight years ago, and only one woman has ever won for Best Director, Kathryn Bigelow. The Academy’s voters, too, are overwhelmingly white and male.

Some people may say, “But look, this year three minority actors are nominated, one of them a woman, and they have good chances of winning!” As much as I admire these actors and their performances, there’s still one thing that bothers me about the roles for which they are nominated. Each of their roles specifically called for an actor or actress of their race, meaning there was no way a white person could have been cast in those roles anyway. In an interview for Lee & Low, actor/writer/director Jason Chan explained that when it comes to casting, “The default is always Caucasian unless something else is specifically asked for.” What I would like to see is a person of a minority background getting recognition (and I mean the kind of mainstream recognition that the Oscars provides, because let’s face it, unless you’re a film buff, that’s what you’re paying attention to) for a role that could be ethnically ambiguous. We need to show that white actors and actresses are not the only ones capable of taking on challenging leading roles and of being role models to the millions of people who watch and admire them.

I know that the Academy is not the be-all, end-all of the film world. Thankfully, independent filmmaker Gina Prince-Bythewood (Love & Basketball, The Secret Life of Bees) notes in the same interview with Lee & Low that the independent world is many steps ahead of the major studios in including diversity in its productions, both in front of and behind the camera.

Once we see more diversity on the production side, maybe we’ll see the same on screen. The most recent census shows that, for the first time, the majority of babies born in the United States (50.6%) are minorities. When will the media start reflecting this change as well?

-Julie Takla ’16

Rom-coms, Valentine’s Day, and How We’re Spending a Lot Without Buying Anything

Last week, people all over the country celebrated the most romantic and undoubtedly most commercialized holiday we know of: Valentine’s Day. From jewelry to Hallmark cards to dinner dates, consumers spend around $17.6 billion on this one day of the year. Of course, the film industry is also on the profiting side. Whereas movies pertaining to other holidays, such as Christmas movies, only come around once a year, Valentine’s Day movies can be seen all year round. They’re called rom-coms.

The romantic comedy has always been a successful hybrid movie genre, and it’s not hard to see why. With their simple storylines, (highly predictable) twists, (sometimes overdone) comic relief, and (too-good-to-be-true) happy endings, they portray a world we can only dream of living in. Yet how are these exaggerated realities affecting the way we live our lives? Some people believe that romantic comedies have ruined their relationships because they teach audiences to expect too much from a romantic partner. Watching two actors play out the most romantic–and fictional–days of their lives can remind us viewers how grateful we are for the people around us, or give us a reason to adopt another cat. Whether we love Valentine’s Day or love to hate it, movies, especially romantic comedies, play a big role in shaping our perceptions of the holiday. Or do they?

According to a 2013 study at the University of Illinois at Urbana-Champaign, watching romantic movies does not actually give people unrealistic expectations for their relationships. The study identified four romantic ideals found in rom-coms: love at first sight, the one and only soul mate, love conquers all, and idealization of one’s partner. However, only the last showed a significant correlation with how often people watch romantic comedies. Most people do not believe they’re going to find “the one” and that their lives will be complete just because a movie says so. And most people do not expect to find a perfect partner who will lie in the middle of the road with them, The Notebook-style, unless they actively watch romantic comedies for dating and relationship advice. Studies show that people who watch romantic movies in order to learn from them are more likely to idealize their partner and believe in the other romantic ideals.

It makes sense: those who go into a movie looking for some kind of message or answer will find what they are looking for. It all depends on intent and the ability to project yourself onto the main character.

Which brings me to another question: whose lives are being portrayed on screen? The film and television trope of the white male lead pursuing the white female lead (or vice versa) is so common that we often don’t stop to think about who’s missing. Where are the racial minorities, the interracial couples, and the same-sex couples? If romantic comedies are portraying the image of the ideal lifestyle and relationship, what does that mean for the viewers who do not fit the white upper-middle class heteronormative profile of the lead characters?

Maybe it’s because we are not all seeing ourselves onscreen that we are not all falling for these ideas of what love and romance are supposed to be. And yet, we’re still spending so much money when we know that what we’re buying is fake: we won’t get perfect endings or perfect partners, and no amount of chocolate or roses is going to get us that perfectly timed promotion or change a person’s entire personality. So why continue this practice? If I were to take a guess, I’d say it’s because, for one day, it’s fun to feel like you’re living in a movie. For just one day, cheesy rom-com rules apply: it’s acceptable to recite 19th-century love poems, or even write your own. Giant heart bouquets of red roses aren’t too over the top. The extra money for a more elegant dinner isn’t too much to spare for one night.

Romantic comedies will continue to give people a few hours of escape into a perfect world and show us what we want to see. So maybe we’re not as deluded as we previously thought, and we don’t really need romantic movies to tell us how to handle our relationships, but that didn’t stop me from watching The Wedding Singer with my roommates last Friday night.

 

–Julie Takla ’16

Do Awards Shows Really Take the Prize?

While many of us are gearing up (or gearing down) for the end of the current cold season, those in the entertainment industry have been eagerly anticipating a different type of season. This is a season that sparkles with glitzy dresses, shining smiles, and over-the-top extravaganzas. This is a season of celebration and commemoration. This is what Hollywood refers to as “awards season.”

Awards season gives celebrities, entertainers, and artists alike the chance to exhibit, showcase, and applaud each other’s work for the year in the form of small trophies and accolades. From the People’s Choice Awards at the beginning of January through the Academy Awards in March, the major awards events (including the Golden Globes, the Grammys, the Critics’ Choice Awards, and the Screen Actors Guild Awards, in addition to the two mentioned above) flood our television screens. For those who can’t get enough, there are countless more.

All of the hubbub surrounding these awards raises the question: are they worth the hype? For the networks televising the shows, the answer is a resounding “YES!” The awards shows draw in mass quantities of viewers, with ratings at record highs for the shows that have aired so far this year. It is easy to explain why the shows are so entertaining. Who doesn’t want to see their favorite celebrities donning tuxes and gowns worth more than we can imagine?

Amidst all of the hype, however, lies a darker side to the glitz and glam. Besides, perhaps, being an annoyance for journalists, the major awards shows often paint a one-sided picture of the industries they wish to highlight. For the film awards, this means a lack of recognition for anyone not part of the “big five,” as I call them—writer, director, producer, actor, and editor. While additional categories exist—cinematography, visual effects, sound design, music, and costume design (among fourteen others)—only the majors are broadcast during the events. For general audiences unfamiliar with film production, this reinforces the assumption that these faces of film do it on their own, when in fact it takes teams of hundreds, if not thousands, to make a movie happen.

Music awards shows—most notably the Grammys—often marginalize the vastness of the art form, televising only the awards for best pop, rap, country, and R&B albums and records. Although they also give awards beyond these categories, the show often turns into a radio love-fest, with the year’s top radio hits sweeping the televised categories and leaving one to wonder what happened to all of the other music.

While televised recognition is not necessary for appreciation of these forgotten categories, recognition is. I, too, love indulging in the fame and fun of the countless red carpets, but the shows should be viewed as merely a carefully crafted sampling of the talent that exists in the entertainment industry. So as you fill out your Oscar ballots and gossip about the Grammys, remember—‘tis the season for a broader appreciation of those who work to keep us entertained.

Lauren Witte, A14

Jan. 31, 2014

Secret Service: government manipulation of news media

Last Thursday, the Islamist website RNN leaked a top-secret six-minute video obtained from “sources in the military.” As the New York Times reported, the video depicts “senior Egyptian Army officers debating how to influence the news media during the months preceding the military takeover” and “offers a rare glimpse of the anxiety within the institution at the prospect of civilian oversight.” In the video, Gen. Abdul-Fattah el-Sisi says that while they have been achieving “better results,” they have not yet reached what they want. (Note: seriously, read this article if you’re interested in media. It’s fascinating.)

But propaganda is not a recent invention, though its manipulative messages have become even more cloaked in recent decades, thanks to social psychology research into effective techniques. In addition, as the officers say in the video, the Egyptian government has a history of manipulating the press, as do China, Russia, Japan, and Great Britain (among many others). Yet although these instances may affect our country’s foreign relations and our perception of global events, because we have a democracy and freedom of the press, our domestic news media is safe, right? Actually, wrong.

In 2008, the New York Times successfully sued the U.S. Department of Defense, gaining access to more than 8,000 pages of records, from transcripts to e-mail messages, describing a multi-year effort the military called a “talking points operation.” The result: this article, detailing the process by which the Pentagon hired “surrogates” or “message force multipliers” who would appear in the media as unbiased “military analysts.” However, the cost of unparalleled access to military secrets and additional financial compensation was an agreement to spin the situation positively, in favor of the American military, regardless of what was actually happening. As David Barstow wrote for the Times, “these members of a familiar fraternity” became a “media Trojan horse” — shaping the tone and conclusions of our supposedly fair and accurate terrorism coverage. Even more distressing, most of the news networks didn’t look into the analysts’ personal or business connections to the military when hiring them to provide opinions.

At the end of the day, since we cannot consistently rely on the objectivity or even accuracy of the news media, it’s up to us as consumers to responsibly and responsively consume that media, whether in Egypt or the US. A former analyst is quoted in the Times article as saying, “The worst conflict of interest is no interest.” He was referring to his own hiring, as well as that of his colleagues, to share warped opinions on major national news networks. But the soundbite holds true for the modern news reader/watcher/listener: don’t accept things the way they are first presented. Do your research, consider your own values, and then reconsider the coverage. The news is just a report, not a dictation of how to view the world.

(Note: you can view all 8,000 pages of released documents here once the government is no longer shut down.)

–Gracie McKenzie, A15

Stopping the Presses: student publications and new media

Consider this blog post. No, really. In the past, a forum like this one—a blog for, by, and about an academic department—simply wouldn’t have existed. But, as we’ve all heard a million times, the world is changing, thanks to the Internet. For some college newspapers, though, the Internet is more than just the new frontier; it’s the only place their product exists.

This phenomenon is nothing new; professional publications have been making the switch to save money for years. The most famous of the bunch is Newsweek, which published in print for 80 years before its changeover and rebranding in October 2012. But the financial crisis hit student publications later and in a different way. In fact, as late as 2009, over a third of college newspapers didn’t have a web presence at all (defined as no website, or a website that hadn’t been updated in six months or more).

The University of Oregon’s Daily Emerald is not one of these publications. After 92 years of printing daily, the newspaper transitioned in 2012 to a twice-weekly magazine, with most content published online. Eastern Connecticut State University took that a step further. In September 2006, students returned to newspaper racks filled not with a new edition, but with fliers informing them that news would now be available only online. While ECSU was the first school to literally “stop the presses” (a move that actually raised legal trouble), others have followed in more recent years, especially at the community college level. Here at Tufts, while many publications have an online presence, none are completely online.

In the end, though, it’s important to remember the purpose of student newspapers. Yes, a record of happenings, on-campus and off, is valuable for the campus. But, more specifically, student publications in general provide outlets for practice, training, and experience in the field of journalism. The act of writing the story may prove more important than the content of that story, even if the byline is online. In fact, in an era in which journalism is increasingly on the Internet and/or multimedia, one could argue that it’s worse for a student publication not to have a web presence at all than not to publish in print.

However, even though funding for college newspapers can be tight, I would argue that it is almost as irresponsible to abandon print journalism completely. Call me a print media junkie, but nothing compares to holding the news in my hands, without a dim, bluish glow. I want to turn the page to see the next section instead of just scrolling down. And I’m not alone: according to research, students prefer the print versions of their campus newspapers, and in 2010 over half of college students didn’t even know whether their college newspaper was available online. When budget cuts and a lack of profit threatened to shut down American University’s weekly Eagle, student response was so negative that the paper made the changes necessary to allow at least a monthly publication.

But the Emerald didn’t change media because of financial difficulties—2011 was one of its most successful years ever. Instead, it was because the paper saw a shift in professional journalism and adjusted its mission and actions to better fit the modern world. In the end, digital news media is neither an add-on to nor a replacement for its print counterpart. When the two exist simultaneously, they complement each other, and students aiming for careers in journalism should be comfortable on either end. Because, right now, that is the future. Print publications may be struggling financially on the whole, but they need young, bright minds with experience in both realms in order to keep functioning. Maybe it will change again: perhaps print journalism will truly become a thing of the past, and we’ll all carry our student publications around on tablets, and every academic department will have a blog. But today, this semester, this year, we still exist in a blend of the two forms, and thus student journalists need experience in both.

–Gracie McKenzie, A15

Social Media and the Syria Crisis

It goes without saying that the events of the Arab Spring will leave a historical legacy of numerous dimensions, permanently shaping how we think about global geopolitics, ethnic and sectarian politics, energy politics, and such concepts as “popular uprising,” democratization, state sovereignty, and international law and the laws of war. All of the states involved in the Arab Spring, including Syria, have also demonstrated that the role of social media in mass movements is a phenomenon that cannot be ignored, whether or not one believes social media can really be credited with any of these movements’ successes. (As one early commentator asked, “Is all you need to topple an entrenched autocratic regime a collection of Facebook updates, YouTube videos and Twitter hashtags?”) It is also true that, as has been pointed out, Facebook’s highest-growth markets are the Middle East, Africa, and India.

Regardless of one’s views on whether social media’s role in Syria has been positive or negative, it is clear that, if nothing else, social media has shaped reactions to the conflict all over the world. Videos posted on YouTube have prompted domestic reactions inside the US, specialist bloggers from around the world have apparently provided important weapons analysis to governments and human rights groups based on their monitoring of social media, and global leaders such as David Cameron have kept the world apprised of their actions through Twitter (David_Cameron tweets: “The use of chemical weapons in Syria is wrong – and any response wound [sic] have to be legal, proportionate & designed to deter further outrages”). One site has compiled an interactive map showing all of the latest tweets on chemical weapons in Syria and the international response (example: “war is business and its [sic] poor people that pay the price, feeling sorry for the people of #Syria #Iraq and the whole #Africa”).

Do you think social media has played an important role in the Syrian crisis? Is this positive or negative? We’re interested to hear your thoughts!

Changing the Way We Chart Music

Beginning this week, Billboard Magazine will incorporate YouTube plays as part of the metric it uses to determine its weekly singles chart, as reported in yesterday’s New York Times.

Coming on the heels of news that the Times is putting The Boston Globe up for sale again, it’s almost too much to take in a single news week.

Indeed, the times they are a-changin’ (stopping to ponder whether a visionary like Zimmerman saw this coming). Yet I must admit I certainly didn’t, even though, with videos like “Gangnam Style” and “Somebody That I Used To Know” garnering more views than the populations of most of the world’s continents, I should have.

Here’s where the nostalgia sets in: I vividly remember tuning into pop radio during my preteen years, counting down America’s Top 40 with Casey Kasem, wondering if my favorite artist would claim the top spot that week, and if not, how far their song would climb or (gasp) dip. But that was a long time ago, when MTV played music and the best video games were six feet tall and weighed 300 pounds.

And so I come to the realization that MTV is dead, and so are arcades, and so goes the way we consume popular entertainment these days. I guess it’s healthy in a way, in that one could potentially vie for pop stardom from their bedroom or basement, remaining (at least for a while) free from the trappings and creative constraints of big-time record companies and their bean-counting partners in crime.

And if something tangible has been lost in all of this – the spinning of a 45, the careful construction of a mix tape for your best friend, or the sights and smells of a freshly opened gatefold of liner notes – something has likely been gained, even if I’m not entirely certain what that something is. Perhaps it’s choice.

Which reminds me of what my son chose to pull out the other evening in lieu of listening to music on his iPod, smartphone, computer, or PlayStation: the motion picture soundtrack to Saturday Night Fever – on vinyl, no less.

At 14, I only dreamed of such a smorgasbord.

–John Ciampa