As a sophomore, I am coming to the time when Emory requires me and the rest of my graduating class to declare a major. On the surface, it seems that this decision will be one of the defining moments of our lives. We are essentially choosing a path to follow and specializing in a discipline.
However, this view isn’t necessarily correct. According to Forbes, job hopping — or voluntarily switching companies or jobs after a short period — is the new normal for Millennials (people born between 1977 and 1997). On average, a person of this generation will stay at a single job a little over four years, meaning that there could be upwards of 15 job changes in our futures. All of this career-switching means that it would be impossible to major in every subject area needed to successfully complete every job.
This is where the importance of a liberal arts education begins. Originally conceived as a way to provide students with a well-rounded education spanning the social sciences, humanities, physical sciences and modern and ancient languages, the liberal arts curriculum has expanded to include teaching writing and logic through various assignments and class discussions. Through its general education requirements, Emory sets a minimum level of liberal arts education required for graduation from the College of Arts and Sciences. It is up to the student, however, to expand their education in areas outside of their major.
At times, this can be difficult. The demands of many majors, in both class requirements and difficulty, can make it almost impossible to balance required courses with ones outside of the major. For people who are double majors, or who, like me, are considering it, this task can seem daunting. There is a feeling that by taking a class outside of the major, one is missing out on the chance to take a class in the major and thus fulfill the requirements needed to graduate with a specialized degree. This feeling can be coupled with a sense of missing a chance to deepen one’s knowledge of the chosen major or majors, especially when deciding between a variety of classes in the department and classes outside of it. Such feelings may prevent academic exploration, to the student’s detriment. Often overlooked, academic breadth is sometimes more important than depth.
It is true that some careers, like those in the medical field, require a depth of knowledge in multiple particular areas, but there are even more careers and paths that require a wide range of knowledge bases. Take someone who works in advertising, for example. A knowledge of anthropology would be important for understanding the culture of the audience for their ads. A knowledge of psychology would be important for understanding the emotional effect their choices have on the viewer, and a knowledge of another language would be helpful so that the implication of what is being said can be translated accurately. The list could continue through a multitude of subjects outside those one studied in college.
While many may decide to double major to increase their marketability to employers or graduate schools, in reality, this may not be the best path to choose. Other double majors may be concentrating in two areas simply out of a passion for both subjects, a situation that requires more contemplation on the part of the individual; if I had to guess, this motivation accounts for the majority of double major declarations. Even these two situations don’t explain every decision to double major. For those double majoring solely to increase marketability, a student who studies two seemingly opposite fields may still not be learning as wide a variety of subjects as their single major counterpart.
This is especially true if the single major is exploring classes in all departments of the college. Many employers claim they are looking for a well-rounded candidate. If that is true, then they will want to see an understanding of subjects that span a great range. Oftentimes, getting a job is more about professional experience, in that area or in general, than it is about your major. This gives us even more incentive to seek out opportunities to build our professional lives in a variety of areas, gaining experience in many job sectors and qualifying for a wider range of jobs. Because experience matters so much in the application process, it may be more beneficial to seek out professional experience than to try to increase your marketability through a double major.
Declaring a major is a way of indicating a focus on a particular subject; it does not mean that a person with a certain major is incapable of having an extensive knowledge of other subjects as well. In no way does my declaration of a History major indicate I am unable to understand biology or chemistry; it is merely an indication of the academic subject I chose to dedicate time to.
For those considering a double major, I would suggest identifying exactly why it is essential that both subjects be majors, instead of choosing a minor in one of the areas. Sometimes, the push to sound outstanding on paper prevents us from being outstanding in real life.
I am sure that to many of our future employers, and to other students like me, a wide variety of classes and mastery of many subjects are far more impressive than specialization in one. It is important to show potential employers, professional schools and graduate schools that you are capable of understanding more than a single subject or area of subjects. This not only makes you more marketable, but it also makes you more interesting and better prepared for any potential employment that comes your way.
While double majoring is still a great choice for some, and I am still heavily considering it, it’s also important to take advantage of the liberal arts education that is available here at Emory. There are many great reasons for double majoring and for some people it is the choice that best fits their goals and desires. I would not be considering a double major if I did not see the benefits of having two areas in which I am well-educated.
Double majoring is, and will continue to be, an individual choice that depends on a variety of factors. I would encourage everyone, double major or not, to look at their academic choices and assess if they are truly preparing them for their desired future. The post-college years are intimidating, and a sense of security can be comforting and can compel you to make choices only for their potential career gain. Double majoring is a choice that should not be made solely for the potential post-collegiate benefits and security some think it provides. The choice to double major, like all academic choices, should be one that is carefully considered on an individual basis in terms of your personal plans and goals for the future.
— By Alli Buettner, a College sophomore from St. Louis, Missouri.
Before I begin, I would like to give a general disclaimer. Although this article was prompted by the incident involving degrading, racist comments directed at the Indian-American population (in which members of the AEPi intramural flag football team allegedly told their opponents to “go back to India,” among other racially offensive comments), its general message concerns the level of racial tolerance both on campus and within society at large. I am by no means belittling what occurred on Monday evening, but I also feel the need to address the many acts of racism that take place on a daily basis in more private settings and do not get as much coverage and support as this recent incident did.
It is important to note that every religious, cultural and ethnic group gets its share of hatred; no one group is immune to the bitterness of the spoken word, and I believe I am speaking for the general student body when I say that each and every one of us knows what it feels like to be discriminated against or looked down upon.
Growing up as an Indian in the United States (as an Indian-American), I have encountered racism both directly and indirectly. The insults ranged from snarky comments about curry and religious traditions to full-on complaints questioning why Indian-Americans are so “white-washed.” Over the years, I learned to desensitize myself to the verbal trash that was being thrown at me. There were moments where people would ask me if I had a husband waiting for me overseas or why I never came to school with a “dot on my forehead.” Time after time, I found myself simply unable to find a way to respond to the stereotype-heavy, ignorant and downright inappropriate questions that were seemingly impossible to avoid. Still, regardless of how hard I tried, the impressions remained. At times, I found myself questioning my racial and ethnic identities, confused as to how I was expected to still “act” and “be” Indian by the general public when I grew up around American food, music, television, customs, friends and classmates.
This, unfortunately, isn’t a singular, unusual situation. I personally have experienced and witnessed hundreds of similar scenarios, with the targets of racism being of all races, religions and ethnicities. Discrimination is an ongoing problem whose occurrence, in a country as diverse as this one, is painfully common. Although the practice is legally banned and socially condemned, public acts of racism remain widespread and growing.
Racism and stereotyping — which, although viewed as less extreme than the former, can be equally hurtful — have been a commonality of human society for centuries. Social psychologist Henri Tajfel described our tendency to group people together through a series of processes called “ingroup” and “outgroup” attributions (essentially, seeing things in terms of groups of people we identify with and groups we do not). As humans, it is natural for us to see things as “me vs. them,” to attempt to understand what we are and what we are not through other people. Still, there remains a fine line between what is natural (or considered natural) and what is socially acceptable — a line that common sense draws in regards to what should and should not be said.
The response by the Emory community to Monday’s racist incident was dramatic and relatively surprising; racism is something many of us have become blind to, and the fact that one incident during an intramural sports game blew up into a full-scale rally against discrimination makes me proud to be an Emory student.
Nonetheless, I feel like these community-level realizations that we do not live in a fully equal, hate-free world come too sporadically.
Yes, we will speak out when a certain racial group gets targeted publicly. But will we do the same when we hear hurtful comments on the elevator, over lunch or in the crowd during a heated soccer game? Or will we simply hope that the words we just listened to will go unheard by those whom they were intended to harm? Yes, most of us have an active, working moral compass. But how many of us will put it to use at risk of being the center of the confrontation?
We highlight racism as a stark problem one day and completely forget about it the next. There needs to be a larger movement to speak out against the smaller happenings, the uninformed stereotyping and the racist humor that appears to be such a commonality within our generation. Why does it take one person shouting in front of a crowd to spark an outrage when this same situation repeats hourly within smaller groups?
If we are living in a society in which “racism is in the past,” why does it still live amongst us?
We may have advanced significantly from where this nation was 100, or even 50, years ago regarding discrimination, stereotyping and racism. But have we advanced enough? Is this where we want society to stand in how we tolerate each other and our differences? Is this the world we want to pass on to the generations beyond us, a world in which we turn the other cheek depending on whether or not we feel confident enough to speak out?
My argument for a new way of thinking lies in how I would answer these questions: we, as the American population, are midway through a journey toward complete racial equality that began somewhere around the end of the Civil War in 1865. Because it took almost 150 years to get this far, it would be unreasonable to call for immediate change; this will be a long process, a tedious series of efforts and social modifications that ultimately result in full and complete racial acceptance. We must learn to react to the smaller incidents — to speak against racism over that lunch at the DUC, during that elevator ride at the library and at that heated soccer game.
There is a great deal of room for change and for acceptance. As we progress into a world where there will no longer be a “majority” and a “minority” within this country, we must also change our mindsets regarding what it means to be accepting and to be human, rather than members of specific racial or cultural groups. The general outcry against what happened on Monday is a large step in the direction of change, as was the campus-wide support for AEPi after the anti-Semitic incident earlier this month. By standing together and continuing to speak out, we, as a generation, can make strides in resisting prejudice and fostering a color-blind society.
All in all, change is mandatory.
Because there has to be something wrong with a social system that otherwise quietly ignores daily occurrences of racism — a system that ends up leaving an Indian-American girl wondering how she could be seen as “white-washed” when she knows nothing more than the American culture she grew up with her entire life.
— By Sunidhi Ramesh, a College freshman from Johns Creek, Georgia
After seeing countless statuses and links on my newsfeed expressing extreme concern about Ebola having come to the U.S. — the paranoid responses of those who are afraid to leave their homes, who want to seal our borders, who criticize our medical workers and public health leaders (I won’t even mention the ridiculous conspiracy theories I’ve heard concerning the President) — I can’t hold in my opinion any longer. People need to stop panicking and start acting.
There have been three cases of Ebola in the U.S. One of these cases, which tragically resulted in death, involved a Liberian man named Thomas Eric Duncan, who contracted the disease in West Africa (If you’d like to discuss how outrageous it is that he was denied treatment, despite his 103-degree fever, possibly based on his skin color and social standing, I have a few opinions on that too).
The two other cases, both health care workers who treated Duncan, are the only two Americans known to have contracted the disease on American soil. Blinded by fear, many American people are forgetting the fact that for many West Africans, facing their relatives and friends dying all around them is a daily reality. I’ve been surprised at the quantity of media addressing the U.S. Ebola “crisis” as opposed to media displaying the utter horror of what is going on in West Africa. This media approach is a massive factor in fueling the American people’s unrest.
The hysteria that our country is falling victim to originates from a fear of our proximity to two of our own citizens seeking treatment here in Atlanta, despite their being under strict isolation and containment.
There have been nearly 9,000 cases reported in the region of West Africa, over 4,000 of which have resulted in death. And these are just the reported cases. Yes, these numbers are very scary. I understand that. But our best chance of stopping this disease is by fighting and containing it at its source. We, in the U.S., have unique resources and expertise to do this. Among these resources are American doctors and scientists who are doing everything in their power to limit this disease to its area of origin.
My mother is one of these. She is a doctor who works for the Centers for Disease Control and Prevention (CDC), and she is currently in Sierra Leone for a month, having volunteered to treat patients suffering from Ebola along with interviewing them to help the CDC track and contain this terrible disease. I’m not expecting that to hit home as much as it does for me, but maybe it’ll add some perspective for some of you. The horror of what she is seeing and living every day is what we need to focus on. If you are able, channel your fear into incentive to support organizations aiding in the control of Ebola.
Consider donating to organizations like the United Nations Children’s Fund (UNICEF), Doctors Without Borders, the American Red Cross, etc., all of which are using these donations to send supplies, medicine and health care workers to West Africa.
Yes, Ebola has come to America, and we may well see more cases in the coming weeks. But let’s stop this talk about closing our borders. As American citizens, we have the right to return to our country, suffering from Ebola or not. And as a country founded by immigrants (which of us, after all, is not a descendant of people who came from elsewhere?), shouldn’t we open our arms to anyone in need, rather than engaging in mindless xenophobia?
Our doctors and scientists, moreover, learn how to combat this disease by actually treating cases, ultimately making us all safer in the long run. I obviously hope my mother does not contract Ebola while doing her work. But, hypothetically, if she did, would you seriously deny her, an American woman who has risked her life protecting you, the right to seek aid in the country she calls home? I understand that some Americans are scared, but we should not allow panic to dictate policy.
This disease can only be transmitted in very limited circumstances involving direct contact with bodily fluids, not through the air. In donating money, you are helping others while simultaneously looking out for your own well-being. The more resources that find their way to West Africa for combating Ebola, the less likely it is to become a serious problem in our country. And if you are unable to donate, remember that gaining knowledge on this subject and spreading it to promote awareness is also of the utmost importance.
— By Anna Bing, a College freshman from Atlanta, Georgia.
You’re a baby. You drink too much. You’re an alcoholic. Don’t you have homework to be doing? You party too much. You haven’t had enough real world experience. I need to hang out with more people my own age. I’m so glad I worked for a while. The work I’ve done inspired me to come here.
I’m 22. I finished my undergrad in May and started grad school in August, and these are the kinds of things my colleagues say to me on a regular basis. Of the people I’ve interacted with, both in person and online, I’m the youngest graduate student I know. It’s not like I graduated early or am some sort of genius. Graduating college cum laude and with newfound friends taught me that I can manage a lively social life and still get killer grades. What people in grad school don’t seem to understand is that I can probably still do that. When I started my undergraduate career, I wasn’t really sure what I wanted to study or what my future held. When I made the decision to go to graduate school, I knew exactly what I wanted to study and where I hoped to see myself in five years. And the place I see 27-year-old me is only possible with a continued education.
According to the Rollins School of Public Health’s website, the average age of students is 26, with a range from 20 to 60. I’m slightly younger than the average student, but not quite the youngest on campus. Sixty percent of people starting at Rollins took time off before returning to school. There are Facebook groups and get-togethers for students not straight out of undergrad. One such Facebook group is labeled “Non-Traditional Students with Four or More Years of Experience”; however, the data above would suggest that most students at Rollins have four or more years of experience, meaning there is nothing non-traditional about them at all. Returning to school is something I can’t relate to directly, and I realize that the challenges in returning to academia must be numerous. But the more time I spend researching this, the more I find that those of us under 24 are the ones without a group.
We are the ones out in the cold looking to fit in with our peers. We’re constantly told we need to change, because “this is grad school now,” and we “can’t just do the same stuff [we] did in undergrad.” We are expected to have gone through some kind of transformative period that suddenly made us deserving of grad school. Instead, we are ostracized. Last time I checked, Rollins’ program required a personal statement, several letters of recommendation, some good grades and the GRE. No mention of an age minimum, and no mention of prior experience required.
I applied to graduate school, jobs and various internships in my senior year of college. What I realized is that I wasn’t qualified for the jobs I wanted because I didn’t have the minimum level of education. The competitive internships I applied to didn’t seem to see in me what I saw in myself.
However, grad programs saw in me the drive to learn more and better prepare myself for my dream career. My resolve to stay in school and learn more before taking on the job market was reignited by such an excellent program believing in me.
Somehow, though, my age seems to invalidate my right to be here. Just like everyone else, I applied and was accepted. I didn’t apply to a ton of schools, I didn’t have a killer GRE score, and I didn’t have endless years of experience to draw upon outside of school. Instead, the admissions department saw an all-consuming passion to do what I love, to expand my knowledge beyond my Bachelor’s degree in Public Health and to start a career after building a firm academic base. They understood that I didn’t need to find myself before coming, because I’ve already found myself, and my dreams start here. We’re all here now. We all made it.
The fact that I’m 22 is irrelevant to my right to be here. The presumptions need to stop. We have the opportunity to view each person’s attributes as unique to them and as a way to make our community stronger. This isn’t what’s happening in grad school. Instead, my experience thus far has been overshadowed by my relative youth, and my perspective has not been respected. We all have something to contribute, regardless of whether we’re coming right out of undergrad or have been out of school for years.
— By Taylor Chambers, a first year student at the Rollins School of Public Health from Madison, Wisconsin.
When asking the basic questions to get to know a person, we ask about their favorite color, their family, their pets. Often, the last two are complexly intertwined, as the status of pets, namely dogs and cats, has risen to that of accepted family members in the opinion of many. Pets have been by our sides for millennia, but the past few decades have seen a shift in social attitudes toward our animal companions. Pets now have more rights and protections than any other non-human animal in our country, especially in light of the court cases that have recently elevated them to quasi-citizens, a status normally reserved for long-term residents of a foreign nation.
Animals began developing into a quasi-citizenship position as early as 1822, when the British enacted Martin’s Act, protecting domesticated animals from acts of cruelty. These legal boundaries have been pushed more often as people’s relationships with their pets progress.
Consider, for example, the 2007 case that gave a golden retriever a court-appointed lawyer to look after her best interests in a custody suit. That same year, a woman was granted a court-mandated restraining order protecting her dog from her ex-husband, since pets are often attacked in situations of domestic violence. Fast forward to this year, when public outcry led to a police officer being fired after he fatally shot a family dog.
Even setting aside the legality of pet rights, spending in the pet industry has never been higher. Last year, $55 billion was spent on people’s animal companions. More than a few cats have received liver transplants, which is no cheap expenditure. Instead of spending enormous amounts of time pondering why this shift has taken place, we must discuss what it means for the future.
Personhood is the state of being an individual or having human feelings. When talking about personhood rights in the eyes of the law, this means granting additional legal protection for animals while toughening animal cruelty laws. On the surface there are tweaks such as changing “owners” to “guardians,” but at the core personhood requires a right to bodily integrity if the animal in question has practical autonomy (the ability to desire, fulfill those desires and have a sense of self-awareness).
Most people, no matter how much they love their furry friends, can’t seem to fathom the thought of them having personhood rights. And why is that? The two biggest issues forming the base of the opposition are self-interest and the concept of non-people having people rights.
Taking a look at the first point: giving personhood to pets can be seen as a slippery slope. There is a great fear that if the neighbor’s dog gets rights, nothing will stop those rights from extending to that tasty fish you eat at Red Lobster or to KFC goodness. Let it be known that I am no vegetarian. I think this argument has two major flaws.
One, it completely ignores the possibility of strict parameters based on an animal’s species and intelligence, as there are already scientifically ranked hierarchies of animal intelligence (studied in the field of cognitive ethology). Animal intelligence is measured with tests that include tool use, problem solving, displays of emotion and consciousness, spatial memory and language acquisition. These tools allow us to create a clear language for discussing animal intelligence.
Secondly, the slippery slope argument is at its core a fallacy, in which a person declares that some event (no more meat meals) must inevitably follow from another (personhood for specific animals) without any rational argument for the inevitability of the feared outcome. This line of thought fails to account for the ways in which real-world scenarios actually play out. I find it irresponsible to hold back on protecting intelligent lives out of apprehension about lifestyle restrictions that are in no way inevitable.
If you think that pets should not gain more personhood rights because personhood is for people, I would like for us to consider corporations. The infamous legal concept of corporate personhood is a doozy. Even though corporations have had personhood since 1886, the concept was brought back to the American public’s attention this summer in Burwell v. Hobby Lobby Stores, in which the Supreme Court allowed the corporation to make religious objections to generally applicable law. Now, whether or not you support this decision or the idea of corporate personhood, you must admit that corporations are made up of people and are not living, breathing people themselves. I am not here to debate whether corporate personhood is good or bad, but I am surprised to see people accept businesses as deserving of personhood but balk at the thought of animal personhood.
In San Diego and Orlando, a debate over SeaWorld’s treatment of orcas (a highly intelligent species) has been heating up for years. The 2013 documentary “Blackfish” took the issue of captive orcas head-on, and multiple awareness groups have arisen to challenge SeaWorld to face the public over controversies detailing the physical and psychological damage orcas experience in captivity.
SeaWorld’s only public response to the film has been a website titled “Truth about Blackfish,” which calls the documentary propaganda pushed by animal-rights extremists.
As a corporation, SeaWorld has more rights than the animals in its care, despite mounting evidence that captivity is shortening the lifespans of its “Shamus” by half a decade, among other major health issues.
I only wonder what degree of animal rights would end debates such as this once and for all. It is slowly but surely becoming probable that the future will hold positive changes in the case of pet personhood.
As more neurological research and ethical debates build the background of legal cases regarding animal rights, it is difficult to deny the shift in attitudes toward what personhood really is and who (or rather, what) deserves it.
— By Erin Degler, a College junior from Orlando, Florida.
In a recent article featured in Politico, Erwin Chemerinsky, the Dean of the UC Irvine School of Law, addressed Justice Ruth Bader Ginsburg’s decision to remain on the Court despite mounting pressure from liberals for her to retire. He writes, “If Ginsburg – nominated in 1993 by President Bill Clinton – had stepped down this past summer, President Obama could have had virtually anyone he wanted confirmed.” Chemerinsky goes on to say that while Ginsburg is a paragon of the Supreme Court’s liberal wing, she is too old to stay on for long. He argues that Ginsburg, who is 81, needs to retire “yesterday” so that Obama can nominate an equally liberal successor before the November elections, when Senate Democrats are expected to lose their majority and thus would be unable to confirm a successor thereafter.
Chemerinsky wasn’t the first pundit to express this sentiment, but the sentiment itself is part of an unnecessary discussion. Even the thought of calculating the departure of a Supreme Court Justice and the successive nomination belongs to a conversation we should not be having. Ultimately, it seems as if legislative politics has entered the Court, a grave outcome that harms the checks and balances set in place by our Constitution.
A variety of decisions from the Court have come to influence modern American politics. In the ’50s, ’60s and ’70s, the Court issued a blizzard of decisions that affected American policy. For example, in Roe v. Wade, the Supreme Court in 1973 issued a majority decision in favor of a woman’s right to privacy when deciding to have an abortion. Roe wasn’t the only decision that sparked political movements, however, as many of the decisions of this era are still debated today. Yet this case was significant in that it elevated the Supreme Court to its current status as a mediator in particularly divisive issues.
What has essentially ensued from Roe and decisions like it is a perceived need to pack the Court with “conservative” or “liberal” justices in order to satisfy political agendas. Decisions such as Roe have heightened the theater surrounding the Supreme Court and have caused party politicians to treat the Court as a political vehicle when making their calculations. Nowadays, even the lower courts can serve political agendas, as with the D.C. and 11th Circuit Courts’ rulings on the Affordable Care Act.
Because the Court has the authority to weigh in on a variety of issues, ranging from civil rights to teeth whitening at tanning salons, it is important that we do not nominate Justices based solely on the ideological climate. Doing so corrupts the check the Court provides against the Legislative and Executive Branches by essentially combining the branches into one governing body, in a manner resembling an oligarchy.
If we could replace Justices on a whim, then the Court would not check the Legislative and Executive Branches, but rather serve them. This is why the life term of a Supreme Court Justice is so important, specifically in terms of the integrity of our government, and why Ginsburg has both the authority and the right to choose when she leaves the Court.
While the life term can seem to create a Court that is often decades behind the political faction currently in power, it is important to recognize that this Constitutional mechanism is essential to an effective government.
In an era where party politics dominate progress, it is the Supreme Court that acts as a buffer against the calculations of any party. Truly, it is one of the few institutions that can still effect change without bowing to politicians or outside influences. While the Court has issued many decisions with its usual drama that seem to paint victory for one side and defeat for the other (Burwell v. Hobby Lobby, for example), its integrity as a system of governance, not a system of politics, must be maintained.
The pressure from pundits and politicians alike for Supreme Court Justices to conform to the whims of party politics is certainly nothing new. From Franklin Roosevelt to Chemerinsky, politics has played a huge role in the efforts of those wishing to alter the Court in the name of a larger political agenda. However, as history shows, the Court has ultimately rebuffed such challenges to its structure, and has remained an institution like no other because of this invariability. The Court still retains its central purpose by serving as a buffer against the abuse of politics gone too far. Immune to outside influence and political threats, the Court is by structure an apolitical institution, and should remain that way.
Ultimately, Ruth Bader Ginsburg is a Justice with liberal leanings, but one who interprets the Constitution through her own lens, not the Democratic Party’s. Even though she is considered an “activist Justice,” her decision to continue serving on the Court is the Constitutional structure of the Court at work. Justice Ginsburg, along with moderates such as Chief Justice John Roberts and more ideological justices such as Antonin Scalia, is part of a Court that has largely consisted of justices from across the political spectrum. The fact that Justices have so often crossed the aisle to issue majority decisions means that the outcomes of the Court’s cases reflect not the will of one majority, but rather a conglomeration of multiple perspectives. This diversity of interpretation has been the crux of many historic decisions from the Court.
The Founding Fathers intended for the Court to consist of Justices appointed from across the political spectrum by multiple Presidents, such that a tyranny of the majority does not overtake the Court. Preventing any one party from holding absolute power in any Branch of government has been a common theme of many Constitutional mechanisms, and is why bipartisanship is so often observed. It’s a system that has been described as messy, but because of its structure, our government can ultimately accommodate both a small majority like the conservative Justices and a vocal minority like Ginsburg.
As with the other Justices, Ginsburg is free to interpret the Constitution as she wishes for however long she intends to. To tell her that she “doesn’t belong” on the Court anymore is both impolite and inconsiderate of her background as one of the few female lawyers in her graduating class at Columbia Law School. It also reflects a misunderstanding of the sole purpose of the Court: to serve as a Constitutional check against the far more ideological (and political) Legislative and Executive Branches. Justices ultimately do not and should not owe a duty to any party, President or ideology, but only to how they interpret the document that has been the foundation of our government for generations. While the meaning of those few thousand words can change over time, the document continues to exist in its full integrity thanks to the Court.
— By Somnath Das, a College sophomore from Warner Robins, Georgia.
Armored vehicles lumber down the dusty road of a small town; heavily armed men sporting body armor and tactical gear ride along in the back. This is the reality of post-9/11 America.
Many military operations overseas are coming to a close, but the War on Terror is still happening. An alarming amount of ordnance from foreign wars is being funneled back into the United States and given directly to law enforcement agencies.
This, coupled with an upswing in the prevalence of paramilitary tactics used by local police, has permanently transformed policing in the U.S. and could alter American society as a whole.
The use of assault weapons and military-style tactics by the police is troubling for a number of reasons. Statistics released by the U.S. Department of Justice show that the vast majority of weapons used in violent crimes are handguns or knives, which makes the use of assault rifles by police seem like overkill.
The “no-knock warrant,” which allows police officers to enter a home without immediate or prior notice to the occupants, is a tool police increasingly rely on. No-knock warrants are used when it is believed that evidence in a home may be destroyed during the time it takes police to identify themselves. Warrants of this nature have been decried as violating the Fourth Amendment. On top of constitutional challenges, the warrants are controversial for other reasons. For example, burglars have broken into homes by claiming to be police with no-knock warrants. Armed homeowners who believe they are being invaded have exchanged gunfire with officers, leading to deaths on both sides. The use of no-knock warrants has grown from about 3,000 raids a year in the 1980s to about 70,000 raids a year today.
But what was the catalyst behind this trend in American policing? The proliferation of heavily armed police is directly tied to America’s “War on Drugs” and “War on Terror.” We might like to think of increased militarization as a result of our post-9/11 mentality, but it is really a symptom of policies from more than three decades ago.
In 1981, President Ronald Reagan signed the Military Cooperation with Law Enforcement Act, which allowed and encouraged the military to cooperate with local, state and federal law enforcement and to render assistance via research, equipment and other assets in support of the then-nascent “War on Drugs” initiative.
This act of government authorized the military to train civilian police officers to use the new high-tech weaponry, instructed the military to share drug-war–related information with police officers and authorized the military to take an active role in preventing drugs from entering the country.
Thus the precedent was set, inviting future legislators to pass laws in a similar vein and further blur the distinction between the military and the police, all in the name of keeping drugs off the streets. Modern theories of policing define the police as civil servants working through local government to prevent crime and apprehend criminals. Police are supposed to use force proportional to the situation, whereas soldiers on a foreign battlefield may use any amount of force necessary to ensure the completion of the mission. But with large amounts of military-grade equipment and training made available to the police, the traditional mindset of police officers is shifting to justify the use of these assets.
More recent legislation, passed after the events of September 11, 2001, has transformed the issue from one of drug suppression to one of fighting the menace of terrorism. Legislation passed as early as the 1990s has resulted in thousands of pieces of military hardware, ranging from weapons to vehicles, passing into the hands of the police for use on U.S. citizens.
This increasingly militarized police force could adversely affect police-civilian relationships as the general populace feels more and more like a people under occupation.
Recently, with the winding down of military operations abroad, the Department of Defense, along with the Department of Homeland Security and the Justice Department, has made it easier than ever for local police departments to obtain military vehicles. Heavily armed and armored mine-resistant, ambush-protected vehicles (MRAPs) have recently found their way into the hands of civilian police. Some 175 of these hulking behemoths of war had been doled out to police departments across the U.S. when they first became available in the summer of 2013, and the number of requests for MRAPs has quadrupled in the past year.
Many civil liberties groups, including the American Civil Liberties Union (ACLU), have condemned the use of military vehicles in American municipalities, stating that the use of military and SWAT style tactics for simple arrests or warrant servings is far from necessary.
The purchase of military-grade equipment, ranging from body armor to the aforementioned MRAPs, is made simple for even the smallest and most low-key police departments through a series of grants from the Department of Homeland Security. These grants are issued to “enhance the ability of regional authorities to prepare, prevent and respond to terrorist attacks and other disasters,” per the department’s own website.
Not only are these weapons made available to local police departments, but these grants make them easier than ever to acquire, a fact that has alarmed many Americans who hold strong convictions about the necessity of a civilian police force as opposed to a military or national police force.
In fact, an Associated Press investigation of the Defense Department’s military surplus program shows that a large percentage of the $4.2 billion worth of equipment that has been distributed over the past 25 years has gone directly to police and sheriff departments in rural areas with very few officers and low crime rates.
This overt militarization also comes at a time when reports on police brutality are occurring with more frequency than in decades past, contributing to an ongoing image problem of police in the U.S.
Every few weeks, stories about police brutality, accidental fatal shootings or other high-profile run-ins with the police permeate media outlets and online blogs: images of police officers raiding Occupy encampments across the country with brutal efficiency, the beating death of Kelly Thomas, a mentally ill homeless man, by California police and the shooting of 18-year-old Keith Vidal. These incidents, coupled with an increased emphasis on paramilitary training and mindset among police, could lead to a very serious breakdown in respect for law enforcement.
It is an unfortunate reality that we live in an era of uncertainty. With terrorist attacks seemingly able to manifest out of nowhere, we rely on internal security forces more than ever for the protection of citizens. Training and arming police with the best equipment seems like a proactive step toward helping them in their anti-terrorism responsibilities.
The problem arises when police begin utilizing these tools in their day-to-day operations; for example, serving a warrant for a non-violent drug offender does not merit the use of full body armor and SWAT-style raids. This disproportionate use of force on a nation’s citizens fosters resentment and suspicion toward law enforcement officials.
Ultimately, police militarization does more harm than good. Police come to be viewed as oppressors, and citizens are viewed as potential threats. The result is that we as a nation are more endangered by our own police forces than by terrorists, and this reality causes distrust of police on a fundamental level.
When you are more likely to end up dead at the hands of those sworn to protect you than those sworn to destroy you, who is the bigger threat?
— By Andrew Morsilli, a College senior from East Greenwich, Rhode Island.