Table Talks

In her Sept. 15 op-ed “Emory Students Tend Towards Self-Segregation,” Aarti Dureja paints for us a divided Emory, where the high school hope of a campus without the constraints of “popularity, prettiness, (or) appearance” is revealed to be nothing but a pipe dream. Instead of looking up and out, Dureja writes, we look down and in. We’re governed by our “own social hierarchy.” We stand proudly under the glowing banner of “diversity” but find ourselves in stratified pockets, far from those who aren’t like us.

While we’re not sure “segregated” is an apt descriptor of Emory, Dureja is onto something: there are quiet but very real social constructs at play on our campus. If you’re a white Anglo-Saxon kid in Greek life, chances are that you haven’t had an in-depth conversation with the man who makes your omelet in the DUC. If you’re the Sodexo employee, you’ve likely never interacted beyond a cursory nod with the student from Nanking who wears a t-shirt with letters you can’t read.

As Dureja suggests, we’re insular by nature. That’s okay — we do what makes us comfortable.

But there’s more to the story. After freshman year, we begin to solidify our social circles, our organizations, our communities. “Sticking together” in the University setting certainly doesn’t deserve moral indictment. At a very basic level, it’s what we should be doing: fostering our interests and bolstering our own identities by learning from and feeding off of people who are like us.

That’s why Emory has created spaces where unique identities can be cultivated. We have chapels and churches, houses and lodges, labs and dugouts and diverse offices. Those are the spaces where we cultivate difference and celebrate it.

But where’s the intentionally common space at Emory? Where’s the space for the Hindu Students Association to talk to the Bioethics Society? For Emory Pride to mix with Brotherhood for Afrocentric Men (BAM)? For the PhD in the Harris tweed to sit with the mailroom employee in the collared shirt? Sure, we’d like to hope those interactions would happen organically – but often, as Dureja spells out, they don’t.

Fraternities and sororities host mixers; religious groups send representatives to discuss theological questions at the Inter-Religious Council. But Emory has no streamlined framework for social mixing, for conversation or for asking the hard questions to the only people really qualified to answer them.

We’ve heard students charge Emory with a lack of diversity. We disagree. There’s a surplus of diversity, but a deficit of the will and desire to immerse ourselves in it. To ask the right questions, and to listen to the answers.

We have an idea. It’s called TableTalk.

TableTalk is a framework for conversation between groups that would not interact under ordinary circumstances. It’s not an arena for conflict-resolution between communities that have a history of tension. It’s an accessible and intentional context for us to get to know one another, share a meal, ask the questions we’ve always wanted to ask, temporarily leave our realms of comfort and adopt new ones.

This isn’t about Kumbaya. There are neither lofty goals nor visions of rainbow-colored people entangled in one another’s hands. Our hope is not naive. It’s about being honest with ourselves, recognizing that these interactions aren’t happening on their own and taking an earnest step toward changing our broken culture.

We’ve already begun. At one TableTalk, Buddhist and Muslim students spoke candidly about the spectrum of religiosity across their traditions. At another, members of Brothers and Sisters in Christ, a black Christian fellowship, met up with Hillel students and shared biblical passages that spoke to individuals in each group. Both TableTalks ran longer than scheduled. Both introduced participants to someone they hadn’t encountered before. Both left participants asking: “Can we just keep talking?”

TableTalk’s success lies in its simplicity and humility. It convenes and it facilitates – that’s it. Leaders of cultural groups and campus organizations pick another group with whom they believe their members would benefit from sharing a meal and discussion. When we see crossover in interest, the leaders of both groups sit down with each other to draft a list of penetrating questions that will elicit honest, nuanced answers. They invite their members and they lead the discussion. TableTalk just provides the framework, the space and the food.

A TableTalk emerges from the communities’ desires to talk to one another; we give them an excuse to do it.

Join us in creating a space where a Rollins student can share a meal with the driver of her Cliff Shuttle, where Greek women and men can sit down and talk to the leaders of Sexual Assault Peer Advocates (SAPA) about the people for whom each advocates, where a Korean student can be an individual distinct from the amorphous crowd — and where we don’t need to feel “self-conscious” about who we are.

Let’s affirm each of our communities and build a greater one. Let’s grab onto the freedom to ask. Let’s take the laboratory home with us. Because that’s what’s extraordinary about college: when we retire at the end of the day, we keep living here. And here, we’re never really alone.

Join the conversation at

— By Ami Fields-Meyer and Adam Goldstein

Last week, for the first time in over two decades, Emory University fell out of the top 20 in the venerable U.S. News and World Report (USNAWR) rankings.  Although the drop from 20 to 21 is superficially trivial, it is of great symbolic significance given that many students, parents, academicians and college administrators rely on the “top 20” as a rough benchmark for university quality.  But does the drop in the rankings really matter?

In one respect, the answer is “Probably not.” Objectively, the fall-off from 20 to 21 is minor and is plausibly attributable to measurement error and slight shifts in the USNAWR criteria each year.  What’s more, Emory may well be back in the top 20 next year or soon thereafter.  Moreover, as scores of critics have pointed out, the USNAWR rankings are hardly infallible.

They reflect a debatable formula that itself reflects an arcane composite of dubious metrics.  Plus, what’s to complain about? After all, we at Emory are blessed with a lovely campus in a thriving city, a talented student body, many gifted researchers and teachers, bountiful resources and a distinctive mix of a liberal arts and research atmosphere.

Yet, in other ways, the rankings drop is indeed a big deal. Much as the Emory administration is loath to admit it, these rankings impart a sobering reality: Emory is a good university, but not a great one. To be sure, many interesting and exciting things happen here. But compared with the Ivies, Stanford, MIT, Cal Tech or the University of Chicago, cutting-edge discoveries and breakthroughs at Emory tend to be few and far between. Furthermore, like it or not, rankings create reality at least as much as they reflect it. If our rankings continue to stagnate or drop, Emory will find it increasingly difficult to attract the same cadre of high-caliber students and faculty that it has become accustomed to attracting.

Incidentally, the USNAWR rankings are hardly alone in raising a red flag concerning Emory’s reputation. Recently, while waiting for a flight at Hartsfield-Jackson airport, I watched a brief video welcoming visitors to Atlanta. As the narrator, Atlanta mayor Kasim Reed, boasted of Atlanta’s “world class universities,” footage of Georgia Tech and Morehouse flashed on the screen. Emory was nowhere to be seen. Imagine a comparable video of the Boston area neglecting to feature Harvard or MIT, or of St. Louis neglecting to feature Washington University. It would be inconceivable.

The administration is quick to point out that the drop from 20 to 21 is of no great import. As is its wont, Emory has valiantly attempted to put the best possible “spin” on the news while turning a blind eye to its implications. As I write this column, the Emory website proudly displays a story trumpeting the headline “U.S. News Ranks Emory among top national universities” while neglecting to inform visitors of our disappearance from the top 20.

Still, the crucial question is not why Emory fell one slot this year. Instead, the question is why Emory has gone essentially nowhere – except slightly down – in the rankings over the past 25 years while several of our peers, such as Duke and Vanderbilt, have risen. When I first joined the Emory faculty in 1994, our university was ranked 17th in USNAWR. It was a time of enormous excitement and anticipation, and there was widespread talk of Emory being “poised for greatness.” Twenty years later, many Emory faculty members are pessimistic that the words “Emory” and “greatness” will ever appear in the same sentence. Why?

The answer is self-evident to anyone who has followed Emory closely over the past few decades: Emory has not invested sufficiently in academic excellence. Although Emory’s endowment has rebounded and its capital campaign has been strikingly successful, the hiring of new faculty members in the College has slowed to a virtual trickle. Partly as a consequence, the Emory student-to-faculty ratio has climbed from seven to one to eight to one. Fueling the problem, the university has invested much more heavily in Emory HealthCare than in the College, meaning that the hiring of outstanding College faculty has taken a back seat. In addition, for five years in a row, the average College faculty raise pool has been a measly one percent or less, affording department chairs scant leeway to differentially reward faculty members for hard work and scholarly excellence. Not surprisingly, many of our best and brightest faculty members have descended into a state of learned helplessness, apathy and resentment.

To be clear, the fault cannot be laid at the feet of Dean Robin Forman, who appreciates the problem but can only do so much given the limited financial hand he has been dealt. Instead, as the saying goes, to get to the bottom of the problem, we must get to the top of it.

The source of the problem lies squarely with the Emory higher administration, especially President James Wagner, who certainly has Emory’s best interests at heart, and the Board of Trustees. They have been good stewards of Emory’s finances, but it is less clear that they have been good stewards of Emory’s scholarly future. In many ways, our steady decline in the USNAWR rankings can be viewed as a referendum on their policies. Specifically, the President and Board of Trustees have failed to grasp the two greatest impediments to Emory’s excellence: complacency and risk aversion.

Emory’s complacency has been apparent in an absence of urgency on the part of the higher administration. Although Emory has an admirable strategic plan for faculty growth, it will be difficult to sustain without a tangible financial commitment. The same sense of self-satisfaction is evident among the Board of Trustees. When John Morgan took over last year as Chair of the Board of Trustees, he stated that “Emory doesn’t need to ‘change’ who we are to move into the future…Who we are is exactly who we should be.”

This attitude is short-sighted. To take merely one example, the Emory College faculty is remarkably top-heavy. To some extent, this is a nationwide problem, but it is especially acute here. In my own Department of Psychology, out of 32 tenure-track faculty members, only two are assistant professors. The substantial majority of our faculty members are in their 60s and 70s and will be retiring within the next decade. This trend, which is mirrored in numerous Emory College departments, is a recipe for disaster. The impending deluge of lost faculty slots, which will almost surely occur unless Emory invests massively in future faculty hires, will inevitably diminish Emory’s intellectual atmosphere, scholarly quality and reputation.

Emory’s second great enemy, risk aversion, is the bedfellow of complacency. Over the past several decades, the Board of Trustees has been economically conservative, consistently declining to take courageous steps to boost the university’s scholarly excellence. Compare Emory’s ostrich-like approach to its impending retirements with that of Cornell University, which several years ago launched a massive, multimillion-dollar initiative to replace the faculty members anticipated to retire over the coming decade.

Of course, some Emory faculty members might see all of this as irrelevant. They may be content to teach at a high-quality and comfortable university that largely rests on its laurels and that does not expect more of them. Even so, the Emory student body should care. If Emory does not act decisively to reverse the continued stagnation and potential decline in its rankings, the value of an Emory degree may ultimately be downgraded, and along with it, the quality of our faculty and student body.

When faculty members have asked Emory administrators to explain its lack of investment in current and future faculty excellence, the latter have almost always replied with complicated – and at times convoluted – financial explanations that few of us can understand.  Enough of that.  Emory is an institution of higher learning, not a corporation. It is high time for the Emory administration and Board of Trustees to display bold leadership and to stop expecting its students and faculty to remain content with the status quo.  The Emory community deserves better.

— By Scott O. Lilienfeld, Samuel Candler Dobbs Professor of Psychology

Dear freshman,

You probably shouldn’t be here at Emory.

No, it’s not the Ebola. It’s the dazed, confused look in your eye, the wonder and bewilderment in that naive grin. You’re having too much fun. You’re making us upperclassmen feel like callous scrooges reflecting on our innocent childhood. You make us look so old and mature. You make us look like we have life figured out.

It’s not that enjoying college is a bad thing. It’s not that your sweet, uncorrupted spirits are loathsome or unwanted. You just don’t belong in college.

Let me explain.

If you don’t know why you are here, then you probably shouldn’t be here. I’m not talking about reasons like “Emory [was] a top 20 school,” or “Emory has a great visual arts department.” You need to know why you’re here, where Emory is taking you, what Emory will do to you and to your thinking and to your future. Sixty thousand dollars a year is an investment, and if you don’t know what you are investing in, you probably shouldn’t be paying it.

Wait as I count on my hands and toes — and then borrow a friend’s — all the times my professors, parents and mentors told me that college is a place where you figure things out, where not everything has to make sense right away. Your college years are also “the best years of your life,” according to most.

I think our society profoundly misunderstands the heart and soul of a college education.

Marketing, a stigma on blue-collar jobs (like the surprisingly lucrative occupation of a plumber) and an unfortunate buildup of social momentum have all combined to make college another high school, another chunk of schooling that you have to complete just to look your peers in the eye. You don’t take a gap year between eighth and ninth grade. That’s just absurd, unheard of. The notion of a gap year between high school and college now faces a similar fate.

In my senior year of high school, I told my teachers I didn’t know what I wanted to do with my life. Sure, I had some vague ideas, but nothing certain. Their responses were unanimous: go to college. “But wait, I don’t even know if I can afford — ” I’m cut off. “Go, go to college.” So I did. I took out over five thousand dollars in loans and put my family through a financial wormhole to make it happen, but I went to college. Has it been worth it? It’s hard to tell. I wanted to take a gap year so I could investigate the possibilities. But the advice I kept facing was the same: go to college, study what you love, and if you love what you study enough, you’ll be able to make money. I couldn’t see where that lined up with the financial investment I was having to make. But surely, seeing as my advisers were older and wiser, they knew what to do. So I went to college without coming up for air.

This past summer I heard a rising freshman tell me, “I’m a physics major. I don’t know, I just took a class in high school and really thought it was neat.” On top of everything, she was her high school’s valedictorian. Nevertheless, she was looking at everything the wrong way around. It doesn’t make an ounce of sense to study something for four years and then try to find a job that roughly corresponds. If it’s a career you’re talking about, that’s 40 years of work determined by four years of study. That kind of thinking is completely backward; the four years of study should be determined by the 40-year career. Otherwise, you’re risking a miserable life after college, even if your college years are in fact “the best of your life.”

Because we are so hurriedly ushered into college, we wind up building a staircase without even thinking about where it’s headed. When you finally do find out, you might have to tear it down and start all over.


There are two essential approaches to college: the foundational and the facilitative. The foundational approach is the traditional liberal arts education, which has almost gone extinct. Sure, hundreds of colleges offer so-called liberal arts degrees, but they have sorely deviated from the purpose of the liberal arts: to raise educated, critically minded, well-versed citizens who can think, initiate and appreciate. The tenure system, the commercialization of education and a society less and less inclined toward critical thinking have all contributed to the liberal arts’ downfall. If you want, you can get a liberal arts degree just by spouting your feelings to professors who are too scared of losing their jobs to risk pissing off the $240,000 that is their student.

The second approach to college, the facilitative, is more utilitarian: I want x job, I need y degree. This is also known as the pre-professional track. It involves knowing where you want to go and using education to get there. Although adherents to this approach would certainly benefit from a traditional liberal arts education, it is by no means a bad system. It is vital that students with this approach have a strong sense of the field their education is leading them toward. Too many medical school graduates realize that stress, blood and sickness are not for them only when they start their residency.

To summarize, let me reiterate: if you don’t know why you’re here and what you’re doing and where you’re going, you probably shouldn’t be here. Instead, you could take the time to figure out how the next 40 years of your professional career will be best spent before diving into it blindfolded. Or you could work a nine-to-five job for minimum wage to incite some motivation to find something better. You can also always join the army.

— By Jon Warkentine


In late May, Newsweek’s cover featured the title “Sex, Slavery, and a Slippery Truth” alongside a photo of Somaly Mam, one of the most recognizable anti-trafficking activists in the world. The exposé, entitled “Somaly Mam: The Holy Saint (and Sinner) of Sex Trafficking,” was largely responsible for Mam’s subsequent fall from grace, though questions surrounding her credibility were raised years before. Specifically, it accused Mam of fabricating key sections of her backstory as well as persuading young girls to give false testimony. In The Road of Lost Innocence, her 2005 memoir, Mam recounted her experiences of being forced into marriage with an abusive soldier and later being forced to work in a dark and filthy brothel. Yet smaller details, such as her age at the time of these events and the amount of time she spent in prostitution, remain cloudy and inconsistent. Furthermore, Mam came under fire for urging young women she rescued to tell lies, spinning ghastly tales such as that of Long Pross, who claimed to have had her eye gouged out by an enraged brothel owner. Medical records later revealed that the eye had been removed due to a nonmalignant tumor.

As the story unraveled, Mam resigned as the head of her namesake foundation, the Somaly Mam Foundation, an organization whose mission is to “[eradicate] the trafficking and sexual exploitation of women and girls in Southeast Asia, and [empower] survivors as part of the solution.” Mam, a former activist superstar with an impressive list of supporters ranging from Queen Sofia of Spain to Hillary Clinton to New York Times columnist Nicholas Kristof, was branded a fraud.

When the story first broke, I was defensive, disbelieving and defiant. Somaly Mam had been my hero for so many years; in my eyes, she could do no wrong. She rescued hundreds of girls from lives in brothels and gave them the opportunities to rebuild their lives. Did her fudging of a few facts here and there negate her life’s work? In the grand scheme of things, did it really matter that she was dishonest?

Slowly and begrudgingly, I realized: it did matter. It does matter. Intention is not the bottom line, and the ends don’t always justify the means. The bottom line is that activism has to be rooted in honesty. We can’t afford to lose sight of the reason we are in this fight. If the ultimate goal is to give a voice to the voiceless, how is it fair that we replace their stories with ones we invent? In the struggle to make change, we must tell the truth, even when it’s ugly or unpopular or not as sensational as the fictionalized version. The truth is that there are so many real, gritty stories of struggle and triumph in the world that to fabricate them is a betrayal. It means that we have failed to acknowledge authentic stories of suffering and instead traded them in for more easily accessible but cheaper replacements.

But the most important question still remains, as contributing writer Cindy Brandt asks on her Huffington Post blog: “How much blame do we share in the Somaly Mam scandal, for being the crowd thirsty for the most heartrending tale?” We live in a media-saturated society, with dozens of different things vying for our attention at any given time. It takes a lot to stand out, and it takes a lot to surprise us. It takes even more to make us act. Perhaps this is why Mam felt the need to produce over-dramatized versions of suffering. Otherwise, she feared, her story and the stories of countless others would be brushed aside as insignificant or uninteresting. To be clear, this is not an attempt to excuse her actions. It is an acknowledgment that we, too, are responsible for being aware of injustices both far away and close to home. These injustices can be glaringly obvious or unexpectedly subtle, but we must seek them out and give them the attention they are due if we ever want peace and equality to be a reality.

The story of the modern struggle for human rights is far too compelling to be sensationalized and embellished. Activism is grounded in the everyday heroes who work for change bravely and diligently and piece by piece. These may not be attention-grabbing or headline-making stories, but they should be, and they can be if only we put more weight on stories that truly speak to us instead of ones that horrify just enough to guilt us into action.

Our motivation for achieving social change should not be based on shock value, but rather on the steady fact that all life has equal worth and we should all be accountable for each other. Let this be what guides us and inspires us to act. It’s a much more reliable source of motivation, one that can sustain us for far longer.

— By Samantha Keng


The first thing I did when I woke up this morning was use my iPhone. I grabbed it so that I could turn off my alarm but, once it was in my hand, I took the opportunity to update myself on the world. I browsed my Facebook and checked my emails, clearing my phone of all the notifications that had come in overnight. I read the news, looked at some gifs on Tumblr and liked a photo on Instagram — all before getting out of bed.

For most Millennials — according to the Pew Research Center, a demographic born between the years 1981 and 1996 — and many others, this is a normal routine. But it’s also what many critics cite as the youngest generation’s most prominent failing: an impulsive reliance on technology that makes them “minimally employable,” as Jennifer Graham put it in a column for The Boston Globe.  She paints a picture of ambitionless “trophy kids” who would rather hide away in their rooms, playing with expensive gadgets, than go out and find a job. They want success without all the hard work that comes with it.

But, even when Millennials do manage to get hired, they’re the focus of increasingly negative reviews from their employers. A chronic complaint of employers is Millennials’ propensity for over-sharing, especially on workplace-oriented social networks. Andrew McAfee, a program director at MIT’s Sloan School of Management, writes on the Harvard Business Review blog that “one of the knocks against Generation Y is that they’ve been encouraged to believe that everything they say and think is interesting, and should be aired and shared. This is simply not true for anyone, no matter what reality TV producers would have us believe.”

Millennials are difficult to manage. Millennials aren’t passionate about their work and will jump ship as soon as they’re offered higher pay or flashier perks. And, if a Millennial does stick around, they will have a screen to their face at all times. A simple Google search for “Millennials in the workplace” will yield a glut of blog posts propagating these stereotypes.

Articles and videos offer advice to managers on how to handle their troublesome new employees, some more constructive than others. A September 2013 article by Entrepreneur leads with the questionable headline “6 Tips for Managing Millennials (Whether You Find Them to be Entitled or Not)” but the article — to its credit — gives advice that demonstrates a basic understanding of what makes Millennials tick. Tips include creating “micro-moments for mentorship” and providing “purpose, not perks.” It even encourages readers to “see beyond the stereotypes.”

This final piece of advice will be the key to a peaceful and profitable relationship between Millennials and the generations that came before them. Each generation has specialized skills that are a product of the world it grew up in, and all of these skills have a place in advancing the world we all live in. In the case of Millennials, their worst fault could actually be their greatest asset.

The term “digital native” describes someone born after the Digital Revolution, which began in the late 1970s and played itself out over the last few decades of the 20th Century. According to Techopedia, “the term digital native doesn’t refer to a particular generation. Instead, it is a catch-all category for children who have grown up using technology like the Internet, computers and mobile devices.”

To be clear, Millennials are not the only digital natives, and not all Millennials are digital natives. But many are, and it happens that these smartphone-toting youngsters are the same ones being criticized by their employers.

As computer technology advances, so too will its potential applications in the workplace. But these advances will push into uncharted territory, and a successful business will require the deft integration of modern technology and long-held business practices so as not to go astray. It may be true that Millennials lack the skills that would make them great leaders right now, but those skills come with experience, which takes time – an area in which previous generations have a clear advantage. However, it would be foolish for previous generations to look down on the one major skill set that many Millennials bring to the table, one that will help businesses navigate into the future: an affinity for technology.

From troubleshooting computer problems to designing web pages, Millennials’ technology skills can be employed across a wide range of tasks. And if past trends are any indicator of how technology will continue to develop, it will soon be essential for a business to have a staff well versed in the subtleties of computers and the Internet. It will be even more essential that this staff can quickly learn to use whatever new program or hardware is thrown its way. Fortunately for everyone, this is a skill that Millennials were born with.

But, unfortunately for Millennials, it isn’t easy to convince an old dog that it needs to learn a new trick. New technology has great potential, but old-school employers are often content to continue using the same tools they started with. Millennials, on the other hand, have a discerning eye for situations when technology might eliminate work and improve efficiency. Employers should seize the opportunity to take advantage of digital natives’ skills but, ultimately, it is the Millennial’s responsibility to step up and demonstrate when these skills can be applied.

More importantly, for the sake of their generation, it is a Millennial’s responsibility to demonstrate that a smartphone can be used for much more than just taking #selfies.

— By Nick Bradley


Over Labor Day weekend, nude photos of celebrities, including Jennifer Lawrence and Kate Upton, were brought to the attention of Internet gossip fiend Perez Hilton, who then posted the images on his website. Along with Hilton’s blog, these pictures were also shared on anonymous online forums like 4chan and Reddit, according to The New York Times.

Soon after, however, Hilton removed the photos from his website and posted an apology video that expressed his sincere guilt and deep remorse for violating the privacy of these celebrities. But while the video might have appeased Hilton’s own feelings of regret and responsibility for spreading the images, it did nothing against the permanent power of the Internet, where once something is posted, it is nearly impossible to purge its existence.

While it is easy to point the finger at the perpetrators who hacked into these celebrities’ iCloud accounts and obtained the pictures, the real issue that comes to light is our generation’s overestimation of Internet security. Just because something takes place behind a computer screen, or in this case a cell phone, does not guarantee its privacy. Just because something is “deleted” does not mean it is gone forever.

As computer-users, we are under the impression that when something is gone from our screens, then it has been thrown out into virtual trash and taken out to the virtual garbage dump, where it disappears.

This is not the case, and the leaking of nude celebrity photos highlights this misconception of Internet security.

Google, for example, stores nearly everything you do: e-mails, web searches, pictures, contacts, calendar events. Anything you’ve ever clicked, liked, searched, posted, commented on or typed is accessible, according to CNN.

But Google itself is not the problem (what can they really do with my most recent web search of the nearest Chipotle?). It becomes a problem when hackers access this information. And still, one might ask, what can a hacker do with my most recent web search of the nearest Chipotle? Based on that search, they can find out where I am in no time at the simple click of a button.

Hacking into a computer might even be easier than breaking into a house, and yet we are not cautious in what we post and where we post it. It is easy to underestimate the power of a hacker because a sense of ownership and privacy is assumed every time we open our devices; this is my phone, this is my computer, so my actions are confined only to these screens.

But the actions we expect to exist under our eyes and on our screens alone can easily be shared with millions of other eyes on millions of other screens. We want to believe that our security is invincible, that our accounts are impenetrable. But the system is not foolproof. There are loopholes and ways to breach the screens, and hackers dedicate their lives to exploiting these faults.

Since we grew up during the social media boom where everything is shared with everyone, it is only natural for us to disregard the security of all our technology. We use usernames, we install antivirus software, we create passwords and expect our devices to be protected.

Yet time and time again, instances like the recent leak of celebrity photographs highlight the weaknesses of a network in which we place so much trust.

Amid these web blunders, it is easy to forget the purpose of the Internet and what it has done for the world. This global connection of networks allows us to access endless amounts of information, communicate with people around the world and advance technological progress, to name just a few benefits. But this powerful tool can also cause harm.

While some may blame Jennifer Lawrence and Kate Upton for being foolish and careless with their digital possessions, they alone are not at fault. Our generational mindset, formed in a technology-dependent world, shares the blame. We must always be mindful of our actions on the web and cognizant of their potential consequences.

-By Zoe Elfenbein


The publicity surrounding Silicon Valley’s newest spawn, the iPhone 6, seems to grow by the day.

News outlets from USA Today to Forbes magazine have praised the ingenuity of the sixth edition of Steve Jobs’ most successful invention. Other coverage extols the smartphone’s aesthetic appeal: Matt Warman, consumer technology editor at the Daily Telegraph, writes that the new iPhone 6 is “the most beautiful phone ever made.” However, in the midst of this seemingly endless glorification of a slab of aluminum and glass, the world should take a step back and look beyond the iPhone’s glistening eight-megapixel lens. Sometimes, if one listens closely enough, one might hear screaming from the mouths of millions of broken souls.

Unbeknownst to much of the Western world, the price of the iPhone is dirt cheap compared to the sacrifices made by the ordinary people who create it. Each day, workers in Asian countries channel every ounce of their energy into assembling these so-called handheld miracle machines.

Workers in Wuxi, China, employed by Apple’s suppliers have received media attention in the past year over the appalling hours and deplorable conditions in which they work. These workers often put in 12-hour days assembling iPhones. In return, they make only about 1,500 Chinese yuan a month, approximately 245 U.S. dollars. Furthermore, they are forced to sleep in cramped quarters, with up to eight people sharing a room.

These inhumane conditions have led many employees to contemplate, and even attempt, suicide. On March 17, 2010, at a factory in Longhua, Shenzhen, run by Foxconn, another Apple supplier, 17-year-old Tian Yu threw herself from the fourth floor of the workers’ dormitory where she had been living, paralyzing herself from the waist down.

Yu had been forced to assemble parts of iPads and iPhones for over 70 hours a week, working through the lunch breaks she had to skip in order to stay on schedule. The stress of having to complete such an enormous amount of work led Yu to give up on her life completely.

In 2010, 14 of 18 workers who had attempted to commit suicide at Foxconn died. The other four, including Yu, survived. Unfortunately for Yu, that meant carrying on with her difficult life as a paraplegic.

Workers like Yu come from fairly similar backgrounds. In 2013, Leslie Chang, a reporter for The Wall Street Journal in China, gave a TED Talk entitled “What Are The Lives of Chinese Factory Workers Really Like?” In it, she discusses what the workers she had met were really thinking and how people from Western countries can be quick to see these workers as “faceless masses.” Chang speaks about one woman in particular, 18-year-old Lu Qing Ming, who explained her mentality to Chang: “a person should have some ambition while she is young, so that in old age she can look back on her life and feel that it was not lived to no purpose.”

Chang goes on to explain that 150 million workers in China, just like Ming, leave their farming villages to work in the factories, restaurants and hotels of big cities, making up the largest migration in history. These workers, many of them young women, do this in order to achieve more in life than their parents did. Ming had a dream of living a life other than that of a farmer, a dream not unlike that of any typical American adolescent. She, too, wanted the privilege of buying an Apple product and living in her own apartment. However, her story and her path are far more difficult than anyone who does not share her situation could ever fathom.

Behind the facade that Apple and other major corporations want us to see, there are millions of people with stories and souls, waiting for their moments to break free. Corporations such as Foxconn and Apple are businesses that care about mass advertisement and gross income, not the individual lives of their employees. Ironically, the factories that these migrant workers see as a gateway to a better life actually break the dreams of young spirits in exchange for, as Chang puts it, “Coach handbags on our shoulders, Nikes on our feet and iPhones in our pockets.”

While Americans may recognize that these products are convenient and useful, they must also realize that these material items have dehumanized millions of people who will never have their stories told.

-By Jesse Wang

Emory Students Tend Towards Self-Segregation

I grew up in a suburban Colorado neighborhood. I was very self-conscious of my identity, the way I looked, the food I ate and the God to whom I prayed. Most of the people I knew were either white or Latino. As much as I enjoyed learning about them and their cultures, I was excited when I had the opportunity to come to Emory University. Upon acceptance, I scoured the statistics compiled by The College Board and was floored by the differences between the races; the statistics were starkly different from those at my local University of Colorado, Boulder (CU Boulder). CU Boulder’s student demographic consists of only 5 percent Asian, 2 percent Black or African American and 4 percent international students. Compare that to Emory’s student demographics: 22 percent Asian, 9 percent Black or African American and 15 percent international students. The diversity factor at Emory is at least three to four times greater than at my state school. I came to Emory partially to see that diversity and to learn more about global cultures. I also saw it as a chance to interact with my fellow South Asians, whose numbers were few and far between in Colorado.

In high school, I felt affected by that notorious social hierarchy — the popular pretty girls and jock boys, the drama geeks, the studious kids, the druggies. I would have loved to get to know different people from these different social rings, but if I tried, more often than not, I was snubbed. But my teachers, whom I adored, relayed to me their wonderful accounts of college life. They told me I would have a good time, that I would meet people who shared common interests and backgrounds. I realize it’s not the same thing, but from their comments, I assumed that the social boundaries set in high school would dissipate in college: once I entered college, I thought I would not have to worry about popularity, prettiness, appearance and other superficial things that hinder us from actually getting to know each other.

When I finally arrived on Emory’s campus, I took every opportunity to meet different kinds of people. I struck up conversations with as many people as I could. To my surprise, some people didn’t respond quite so warmly. Their cooler responses reminded me of high school all over again. It did not take long for me to realize that students at Emory have their own social hierarchy — and it is worse than in high school because it appears to reinforce racial divisions. I could perceive this as nothing but self-imposed segregation.

The “Emory bubble” is how many students describe their life here, and indeed, it is quite distinct from the real world. We’re relatively comfortable on our cute little campus. Yet I’ve realized that we don’t just live in an Emory bubble: our Emory bubble contains still more bubbles. When I go to the DUC, I see a few tables with people of multiple races. But only a few. I see more tables with people of only one race. As I walk through campus, I see groups of friends walking together — so often of the same race.

The Emory bubbles are not just between people of different races but also between different cliques reminiscent of high school. Generally, it appears that people involved with Greek life keep to themselves. I even sense that there are socioeconomic cliques. Thus, our students segregate themselves by race, socioeconomic class and Greek affiliation. For a reason I haven’t been able to pinpoint, these boundaries feel even harder to shake than in high school. In fact, I might even say that my predominantly white high school was more likely to embrace different cultures and people than my extraordinarily diverse college.

I realize how strong the word “segregation” is. It is heavy with racial connotations and striking reminders of our violent American history. But segregation is what I see. I am not saying that it is wrong to spend time with people who are similar to ourselves. It is easy to get comfortable, but how will we develop into well-rounded individuals if we don’t expose ourselves to the unknown? Certainly, it is important that we learn more about our own histories and cultures. Nonetheless, we should also value other cultures and peoples.

My high school teachers also told me about the transformative conversations they had in college, which changed their minds, refined their opinions and opened new perspectives. We are lucky to study at a school whose students are so global and representative of our entire world. Many people talk about their study abroad trips with great pride, and surely their experiences have been life-changing. But in a sense, our school has brought the opportunity to study abroad right here to campus. Our high numbers of international students and our diverse student demographic give us an opportunity to learn about other cultures, and Emory has numerous multicultural programs to encourage social mingling. I must qualify that a great number of Emory students do overcome boundaries of race, Greek life and socioeconomic class. I commend these people and think we should do likewise.

You might say that Emory is not alone in having segregated cliques and communities. And yes, I can imagine that this same phenomenon occurs at many other universities. But we can choose to be better than that. Emory provides us with many opportunities to interact with one another in a diverse environment. We can choose to be more worldly and open-minded students. We can choose to integrate ourselves and become a more whole, complete school.

The world is globalizing, with people everywhere participating in business, education, science, social life and more. Why are we holding ourselves back from progress? We should open up and engage in conversation with people who are different from us. It will enlighten us, broaden our perspectives and teach us new things, whether it is how to use chopsticks, how to cook a traditional Southern dinner or how to gracefully eat injera with a bowl of shiro.

-By Aarti Dureja
