I cannot remember the exact moment I met then-Dean of Campus Life Bill Fox, but it was very soon after I arrived on the Emory Campus as a freshman in the fall of 1979. He was just about always smiling. He was just about always available. He was always in our midst, and it was evident he loved being with us, the students of Emory University.
I was fortunate to get to know Bill very well over the years. By the time I was heading towards graduation, I counted Bill among my closest Emory friends. That might sound strange to many, how an administrator could be such a close friend to an undergrad, but all the Emory alumni who knew Bill understand. He did not set boundaries or limitations based on age or rank. He opened his heart to those around him, no matter how young.
As an undergrad, I would get together with Bill from time to time to have lunch and catch up. I had the opportunity to take a class with him my senior year, where he had us journaling about the books he assigned us to read. Although we already had a strong bond, we came to know one another even better through that class.
When I left Emory to pursue graduate school, it was hard to say goodbye to so many people I loved at Emory, such as Bill. These were the days long before the Internet, e-mail, texting and Facebook. Long distance phone calls were costly. Keeping in touch took more effort, and sometimes it’s hard to remember how we did it then, but we did. I stayed in touch with Bill Fox. We had him come to our alumni club in Philadelphia, and he remained one of the most sought-after Emory speakers for alumni events. Whenever I would visit Atlanta, Bill had advance notice so we could get together and catch up.
Soon after my husband and I became engaged, I attended an alumni leadership seminar at Emory. My then-fiancé flew down to join me at the end of the conference, so I could introduce him to my beloved university and to some of the people who had made those college years so special. My husband, Bill and I had a lovely lunch together at what was then The Depot.
I was very excited for my children to meet Bill and his wife, Carol, when I took them to Emory a few years ago. They had heard me speak about him over the years, and I was also looking forward to Bill meeting my kids. Unfortunately, Bill was not feeling well, so that meeting did not take place. But while my family dined at the Sun Dial Restaurant at the Westin Peachtree Plaza, I left the table to speak with Bill when he called, happy to know he was close by even if we did not get to see him.
Facebook has been abuzz with the news of Bill Fox’s passing. I almost expect his name to be listed as what is “trending” right now. It is comforting to read what others are writing about this special man. He touched so many of us.
When I think of Bill, his smile is the first image that comes to mind. Then comes his slow, lovely Arkansas drawl, saying a word he exclaimed often: “Wonderful!” There is warmth in his eyes. Concern. Interest.
Bill Fox was one of a kind. The many thousands of us who were fortunate to be at Emory when he was have benefited immensely from his leadership. When I learned Bill would be retiring in 2005, I was quite sad, finding it nearly impossible to imagine an Emory without Bill there. Now I am finding it almost incomprehensible to imagine a world that no longer has our Bill Fox in it.
Tali Segal is a member of the Emory College Class of 1983.
Is the absence of mental illness mental health? This question was posed by Emory’s own Sociology Professor Corey Keyes and serves as one of the central themes of the newly emerging field of positive psychology. The field counterbalances traditional psychology’s emphasis on diseases and deficits, referred to as the deficit-or-disease-based model.
In other words, traditional psychologists often sought simply to identify their patients’ deficits and disorders, then remedy them. Positive psychology posits that we should be as concerned with our strengths as we are with our weaknesses. As I will discuss shortly, studies have shown that increased levels of positivity benefit us in a multitude of ways and actually alter the cognitive architecture of our brains.
Two terms can help us better understand this discussion: flourishing and languishing. Keyes describes flourishing as the presence of mental health, found in those filled with positive emotions. Languishing, on the other hand, describes adults who possess incomplete mental health and are functioning poorly, both psychologically and socially. Put simply, those who are flourishing are the most satisfied with their lives and are socially and psychologically healthy. Those who are languishing are more likely to feel “empty” or unfulfilled.
So now let’s get into the numbers, because after all, as the essayist Christopher Hitchens said: “That which can be asserted without evidence, can be dismissed without evidence.”
In a study spanning 1995 to 2009 and including over 1,700 subjects, Keyes and his fellow researchers found the traditional deficit-based model lacking and called instead for investment in mental health promotion and protection. They drew these conclusions after finding that nearly half of the subjects who were free of mental illness in 1995 remained at or had declined to moderate mental health by 2005. This group was now just as vulnerable to mental illness as the 17.5 percent of subjects who had a mental illness in 1995! Furthermore, nearly six in 10 adults free of mental illness had as high a risk, or higher, of developing a mental illness as individuals previously diagnosed with one.
But why should we care?
Aside from the fact that leading a life absent of mental health (languishing) sucks, Keyes (2002) found that languishers are nearly six times more likely to suffer a depressive episode (more numbers for you). Further, your creativity, productivity and energy levels will all suffer.
But don’t worry, there’s hope!
Higher levels of positivity and well-being contribute to what positive psychology researcher Shawn Achor calls “the happiness advantage.” After three years of research across 45 countries, Achor found, in schools and companies alike, that our brains perform significantly better when positive than when negative, neutral or stressed. Further, our intelligence, creativity and energy levels all rise.
Achor determined that our external assets (our job, our salary, our appearance, etc.) have very little to do with our long-term happiness. Whereas only 10 percent of our long-term happiness is predicted by these aforementioned external assets, 90 percent of our long-term happiness is predicted by the way our brain processes the world. For example, Achor found that 75 percent of job successes are predicted by optimism levels, social support systems and the ability to see stress as a challenge and not a threat. In summary, Achor states: “What we’re finding is it’s not necessarily the reality that shapes us, but the lens through which your brain views the world that shapes your reality.”
Unfortunately, many of us, myself included, have been following a broken formula for years. That formula is, “If I work hard, I’ll be more successful. Once I’m more successful, I’ll be happy.” However, according to Achor, this formula is scientifically broken and backwards.
Achor describes this unfortunate situation as, “first, every time your brain has a success, you just changed the goalpost of what success looked like. You got good grades, now you have to get better grades, you got into a good school and after you get into a better school, you got a good job, now you have to get a better job, you hit your sales target, we’re going to change your sales target. And if happiness is on the opposite side of success, your brain never gets there. What we’ve done is we’ve pushed happiness over the cognitive horizon as a society. And that’s because we think we have to be successful, then we’ll be happier.”
Sounds all too familiar, right? I know it did for me. But here’s the secret. We must first think positively. This will activate the previously mentioned happiness advantage, which will in turn increase our productivity, our energy and yes, our happiness levels, both short- and long-term.
Achor found that an exercise as simple as writing down three things we are grateful for each day, for three weeks, was enough to actually rewire our brains toward positive, more optimistic thinking. Doing so trained the brain to stop scanning the world for the negative and instead to seek out the positive. Find the good, think positively and you are well on your way to seeing the world in a new and improved way.
— By Matt Kohn
“Out of face, out of mind, right?”
It’s not until it’s placed right in front of you, through a direct impact on you or someone around you, that you know it’s there. The truth is, at some point, it will be part of your life. What am I talking about? Mental illness. One in four people in the U.S. experiences mental illness every year. This means that your family members, friends, peers and even you can or will experience mental illness. You are probably thinking this is very unlikely because you have not seen or experienced it yet. Mental illness is real, but it’s not talked about.
Most members of American society view mental illness as taboo. We go out of our way to avoid it. We fear mental illness, we fear discussing it and we condemn those who suffer from it. According to the annual report by the National Alliance on Mental Illness (NAMI), one in two people are frightened by mental illness, two in five think people with mental illness are a threat to society and “psycho,” “nuts,” “mentally ill” and “crazy” are the top descriptors for those with mental illness. These unfounded myths are helping to dig a deep, dark hole in which the mentally ill are confined. I’ve had people who are very dear to me hide their mental illness. Their hesitancy to discuss their illness ultimately led them to take their own lives. What affected me the most after these individuals passed away was people’s reactions to these situations.
After the death of a friend of mine, I was told my friend had a heart condition and passed away after some sort of heart attack. Two weeks later, a relative of mine, who is very close to this individual’s family, told me, “I have to tell you something, but you cannot tell anyone, because the family does not want anyone knowing. They were suffering from severe depression; they overdosed on medicine and took their own life. Apparently they had been suffering from it for years, but the family does not want anyone knowing. The mom seemed very embarrassed about it.”
At that moment I cringed and put my hand over my mouth. I found it disgusting that people so close to this individual left him in such a dark place and, through ignorance, pushed my friend to his death. Ignorance about mental health is a real problem in this country. Mental illness is the most stigmatized disease in this nation. Just the other day, I read an “Emory Secrets” post by an individual who broke down to their parents, admitting they were ashamed of having a mental illness when they were supposed to be okay.
I am proud of this person for speaking out because the majority of people do not. Just look at the impact mental illness has on college campuses. More than 45 percent of young adults who withdrew from college because of mental illness did not request accommodations. Half of these individuals did not access mental health services and supports, either. What I find most alarming is that the number one reason for not seeking help is concern about the stigma. This stigma has pushed college students to a breaking point. Suicide has become a major concern on college campuses. Seven percent of college students have “seriously considered suicide” during the past year. Not only that, but suicide is the third leading cause of death on college campuses.
We can change this; taking a different point of view can do a lot. How? By letting individuals with mental illness know that there is nothing wrong with having one and by being a supportive and loving community. Take the community in Zanzibar, which views schizophrenia as possession by a spirit. Schizophrenic individuals receive better treatment there than in many first-world countries. It is important to note that ‘treatment’ means not simply psychotherapy and medicine but every single step the individual chooses to take for his or her betterment.
Recent positive psychology studies show that the determinants of happiness break down as follows: 50 percent is determined by genetics, 10 percent by living conditions (i.e., socioeconomic situation) and 40 percent is up to the individual. The individual therefore controls a substantial share of his or her own capacity to flourish. If an individual with a mental illness, clinical depression, for example, takes the appropriate medication, attends therapy and engages in healthy life choices such as exercising, attending alcohol- and drug-free events and enjoying every single moment of life, then he or she can overcome this predisposition. Even though the “recipe” for mental health treatment is written, people seem to be lacking the ingredients. What do I mean by ingredients? “Ingredients” are an understanding of what mental illness is and knowledge of the resources available. The suppression of talk about mental illness not only creates an emotional cage but covers people’s eyes, keeping them from knowing about the resources around them.
At the beginning of the semester last year, I was undergoing severe anxiety to the point that I couldn’t breathe or sleep. I contacted Emory’s Counseling and Psychological Services and found out about stress workshops, psychotherapy sessions, co-dependency group sessions and much more. After I learned about this, I thought about how students really don’t know about what is available to them here on campus.
You know why? It goes back to the same issue: people fear judgment or exclusion, so they ignore their illness and do not seek resources. To reiterate, we need to change our perspective, and we need to be more open. The treatment options are there, but society itself is hiding them from those who need them. If we view mental illness in an open and positive manner, individuals will seek the help and support they need. Everyone deserves to be loved; everyone deserves to love him or herself. By changing our perspective on mental illness, we can help 61.5 million Americans become mentally healthy. Help mentally ill individuals speak out and seek help. So, are you ready to talk about mental illness?
— By Eva Kassel
“Hi, how is everyone doing? Will we be starting with appetizers tonight? Have y’all had a chance to look at the drink menu?”
I have spent the last 10 months working as a server at a sushi restaurant in Buckhead. We are known for sake bombs.
When people find out about my part-time job, they always have a lot of questions for me. The most difficult question to answer, though: “How much are you paid?” Technically, I am entitled to the federal minimum of $7.25 per hour. In practice, my wages are extremely variable. According to the Department of Labor, because servers are “tipped employees,” or employees who “customarily and regularly receive more than $30 per month in tips,” my Georgia employer is only responsible for a minimum required cash wage of $2.13 per hour.
The difference of $5.12 per hour comes from customers. I am essentially paid in tips. At the end of the night, servers calculate their total sales and are required by management to “tip out” based on this figure. In other words, servers don’t keep all of the tips they make but instead must redistribute some of their earnings to the kitchen staff, the hostess, the busboys and sometimes the bartender. For me, this means that I keep approximately 80 percent of my tips (depending, again, on how high my tips were relative to my sales).
Here is a simple example of how this works. Pretend you are a single person eating at my restaurant. Let’s say you order two specialty sushi rolls and two beers on draft for a total cost of $30, including tax. You pay in cash and leave a $5 bill on the table for a tip. At the end of the night, based on your $30 check, the kitchen staff are entitled to $0.68 (2.25 percent), the hostess to $0.30 (1 percent) and the busboy to approximately $0.15. So, from your $5 tip, I will keep $3.87, or about 77 percent. Five dollars on a $30 check is fairly standard. It comes out to just over 16 percent. (As a rule of thumb, if you want to give a moderate tip in Georgia, leave cash equal to double the sales tax on your check.)
Let’s say that you were absent-minded (or vindictive) and did not leave a tip. I lost more than just your tip. Remember that I “tip out” at the end of the night based on total sales, not on tips. In other words, I not only failed to make money, but I lost $1.13 because I bothered to serve your table.
Thus, it is possible to take home less than minimum wage at the end of the night. If this happens, of course, the discrepancy will come back in my paycheck at the end of the month (less Social Security and taxes). This does happen occasionally. For example, because of a bad tip night, I took home $40 for eight hours of work on the Fourth of July. Because eight hours times the federal minimum wage of $7.25 is $58, my employer then owed me $18 for this shift.
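The arithmetic above can be sketched in a few lines of Python. The 2.25 and 1 percent shares come from the check example; the 0.5 percent busboy share is an assumption inferred from the $0.15 figure, and the wage top-up follows the simplified reckoning used above (hours times $7.25 minus tips taken home):

```python
from decimal import Decimal, ROUND_HALF_UP

# Tip-out percentages of total sales, as quoted in the example above.
# The busboy share (0.5%) is an assumption inferred from the $0.15
# owed on a $30 check; the real rate varies by restaurant.
TIP_OUT_RATES = {
    "kitchen": Decimal("0.0225"),
    "hostess": Decimal("0.01"),
    "busboy": Decimal("0.005"),  # hypothetical rate
}

CENT = Decimal("0.01")


def tip_out(sales):
    """Total owed to support staff, each share rounded to the cent."""
    sales = Decimal(sales)
    return sum(
        (sales * rate).quantize(CENT, rounding=ROUND_HALF_UP)
        for rate in TIP_OUT_RATES.values()
    )


def take_home(sales, tip):
    """Server's net from one check: the tip minus the tip-out on sales."""
    return Decimal(tip) - tip_out(sales)


def employer_owes(hours, tips_taken_home, minimum_wage=Decimal("7.25")):
    """Top-up owed when a shift's take-home falls short of the federal
    minimum (simplified, as in the Fourth of July example)."""
    shortfall = Decimal(hours) * minimum_wage - Decimal(tips_taken_home)
    return max(Decimal("0"), shortfall.quantize(CENT, rounding=ROUND_HALF_UP))


print(take_home("30.00", "5.00"))  # the $5-tip example
print(take_home("30.00", "0"))     # the no-tip table: a net loss
print(employer_owes(8, "40.00"))   # the Fourth of July shift
```

On the numbers above, this yields $3.87 kept from the $5 tip, a $1.13 loss on the no-tip table and an $18 top-up for the Fourth of July shift.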
On a busy night in the restaurant, I can make up to $25 per hour. More generally, I can depend on about $85 on a weeknight and $130 on a weekend, or $15-20 per hour (the difference made up in alcohol). My average tip is 22 percent. Serving is hardly a career option I would like to pursue after I graduate from Emory, but I do earn in excess of minimum wage. But then again, my experience is atypical for a server working in Georgia: My restaurant has a loyal clientele, we are located in an affluent Atlanta business district and I’m a chatty college girl with a knack for flattery.
I come into this discussion from a place of relative privilege. To begin with, I only work part-time and often less. If I have a few consecutive $60 nights, or if I don’t have time to take on a shift, my parents can step in to help me cover my basic expenses. Most of my coworkers don’t have a parent or a spouse who can be a stopgap if things go wrong. For example, last month the car of one of my coworkers broke down. The repairs cost more than he could afford, and he has been taking MARTA to work ever since.
Restaurant employees are some of the hardest working people I have ever encountered. Restaurants are high-stress environments. The back of a restaurant is a hot, frantic place filled with the sounds of sizzling food, the clatter of dishes and the back-and-forth shouting of cooks and wait staff in many different languages (at my restaurant, these are English, Spanish, Japanese, Vietnamese and an Indonesian dialect). Customers’ orders and extra requests send me running all night in an Euler circuit around the floor; a six-hour shift is a workout. Although servers tend to be an extroverted bunch, it’s hard to be unremittingly cheerful when confronted by the inevitable demanding or petty customer.
People employed by the restaurant industry are disproportionately the working poor. More significantly, these people are disproportionately immigrants and racial minorities. According to the Center for American Progress and the Restaurant Opportunities Center, a not-for-profit organization working to improve wages and working conditions for the nation’s low wage restaurant workforce, 40 percent of tipped workers are people of color and 23 percent are immigrants, compared to respective rates of 33 and 16 percent in the general workforce. Moreover, over 50 percent of tipped workers with incomes below the poverty line are racial minorities.
I am writing this article to inform you, potential Georgia restaurant patron, that a vulnerable and hard-working population is at your mercy. The Georgia General Assembly has allowed service industry employers to shift the burden of employee wages onto customers. As a result, servers are dependent on your largesse for their livelihood. It is essential that consumers understand wage laws and the concept of tipping. Tipping is not merely a courtesy, but a custom assumed by state and federal governments.
The next time you eat at a restaurant, spare a thought for your server. Poor college student that I am, I try to figure in a 20-25 percent tip for good service on top of menu item prices in deciding whether I can afford to eat out. If your server is rude or makes a mistake, either look past it or complain to the management — don’t take it out on their tip.
— By Rebecca Burge
Zachary Elkwood is a member of the Class of 2015. His cartoons appear in every Friday issue of the Wheel.
Most of the morbidity and mortality in society is due to individual behavior. Individuals usually enhance their health either by avoiding risky behaviors (e.g., smoking) or by adopting healthy ones (e.g., physical activity). This link has led to increased interest in individual-level interventions that address health problems through lifestyle change. More recently, however, health promotion has turned toward community action through health education. Community-level interventions use lifestyle and environmental changes to promote the behavior changes that can address health problems. Health promotion programs should include not only actions to strengthen individual skills but should also focus more heavily on community efforts to turn social and physical environments into healthy ones.
Individual-level interventions often have incomplete content: the evidence for their effectiveness is lacking, which ultimately leads to inappropriate delivery. Individual-level interventions used to modify risky behaviors are based on stage theories, such as the Transtheoretical Model (TTM). Stage-based theory holds that behavior change occurs in discrete stages, each facing its own barriers, rather than as one continuous process, so an intervention derived from a stage theory should incorporate several key elements.
This framing might suggest that individual-level interventions would be more effective at addressing health problems, but this is not the case because of the complexity of the factors at play in each stage of change. The stages include: 1) those who have not decided to change their behavior, 2) those who have decided to change their behavior and 3) those who have already engaged in behavior change.
The need for community-level intervention can be seen in strategies used for HIV risk reduction. For example, the Health Intervention Project (HIP), a community-based program, targeted African-American female drug users in order to reduce the risk of HIV infection. The project compared the results of interventions aimed at women’s individual characteristics with those of community intervention.
Several women gave up crack cocaine by ending individual social relations and changing their daily life structures. However, women who were part of a 12-step community program were more likely to stop cocaine use and injection drug use. This is because these women were provided with positive reinforcement and social support.
Furthermore, some participants said that the absence of community organizations and of youth and elderly associations for health and social services was problematic. Individuals within social networks or systems acquire information, change attitudes, develop beliefs, acquire skills and practice behaviors, all of which eventually have a positive influence on changing bad behaviors. The assumption behind community-level intervention is that individuals make up the large and small social networks or systems that promote health.
Community-level interventions are more promising for addressing health problems and improving health because of the possibility of modifying the social environment. A study of the prevalence of coronary heart disease (CHD) risk factors found a 50 percent decline in the CHD mortality rate after the implementation of a community-based CHD prevention program. Community-level interventions improve health outcomes by reducing the population’s overall risk rather than just individual risk. They further aim to decrease the disease burden through risk reduction strategies across the community.
Moreover, community-level interventions aimed at reducing CHD risk factors are a cost-effective and feasible approach to reducing CHD rates and associated morbidity in the community.
However, community-level intervention, which is designed to create a healthy social environment, still needs further research: there is a lack of understanding of its concepts, a lack of interventions that bring about social change and a lack of feasible methods.
Individual-level interventions have also been less effective because public health research often highlights community-level interventions while ignoring individual-level ones. This reflects limited knowledge of how individual-level efforts arise, as well as a limited understanding of the strategies needed to improve the public’s health.
Therefore, until we better understand the individual-level approach, community-level interventions should be used.
—By Rania Al-Qudaihi
Health disparities are the main obstacle to achieving equal health rights, under which everybody enjoys a good standard of health. In other words, wiping out health disparities should be a priority in attaining health rights. Health disparities can take the form of disparities in health outcomes or disparities in health care access. Even when people have affordable health services of good quality, other factors, addressed below, affect their health and lead to disparities. To address health disparities among racial minorities, several considerations must be taken into account.
Socioeconomic status (SES) is one of the main obstacles facing health disparity programs. People with low SES usually have low incomes, little education and insufficient employment, so they end up with poor health outcomes. Because of inequalities in access to health care, people with poor health outcomes will in turn have low incomes, and jobless rates will increase. This vicious cycle can be interrupted only as the effects of health disparities decrease. Although some literature shows that SES is not related to racial health inequalities, other literature shows that SES factors (i.e., income, education and occupation) matter independently for addressing health disparities. Therefore, health policymakers and program designers must take SES factors into account when they target health disparities.
Racism exists in most countries. Unfortunately, its effects extend to health services and health quality, leading to health disparities. Offering health services and good-quality care in a race-dependent manner makes certain population groups suffer from health inequality and end up with poor health outcomes and unhealthy behavior. Additionally, poor health outcomes often exacerbate the conditions already affecting SES. Racism can also lead to residential segregation, in which people cluster in certain locations according to their race. Residential segregation affects housing and rental prices, which can make poor people poorer and decrease their chances of getting jobs, further affecting their health outcomes.
These factors widen the gap in health services across the population and increase the burden of health disparities. Moreover, race and genetics are responsible for many inherited and familial diseases, such as sickle cell anemia and thalassemia, as well as non-communicable diseases, which add to the burden. Also, belonging to a minority group or having a different skin color can lead to perceived racism, in which a specific group of people psychosocially perceives that it is being discriminated against, whether or not that is true.
Culture and belief are directly associated with health behavior. Cultural norms such as language, religion, thoughts and customs will affect people’s health either positively or negatively. Additionally, race is a culturally constructed entity, and racism can be a result of a population’s culture. People are immensely affected by social and psychosocial environments in which they are embedded, and they are affected by their cognitive thinking as well. Thoughts and customs are responsible for many behaviors that lead to many health outcomes.
For example, smoking, alcohol intake, diet habits and physical activity are important measurement criteria of health status. As a result, they are an intrinsic part of measuring health disparities among people. Religion also might increase the burden of health disparities by, for example, preventing condom use, potentially making people more susceptible to sexually transmitted diseases.
Addressing health disparities is crucial, as they affect people on both individual and economic levels. Although health disparities are inevitable, diminishing their effect on the population is a must. Narrowing cultural gaps in health behavior might offer great promise in combating health disparities.
— By Abdulaziz Aloufi
In Integrated Chinese Level I Part II, a popular collegiate Chinese textbook, the character Gao Wenzhong complains to his friend Wang Peng that he is noticing some rather unfavorable changes in his physique. Peng responds that Wenzhong eats too much and needs to start doing daily moderate exercise in order to notice a positive change in his health. After trying some exercises such as swimming, jogging and playing basketball, Wenzhong decides that he cannot find the right sort of exercise for him and continues on with his daily habits. The story leaves the future health of Wenzhong to the reader’s imagination; however, it teaches a crucial lesson about the need to exercise while also teaching me how to say “fat” in Chinese. More importantly, the story reminded me of a glaring problem in American society today: a health care system that needs urgent saving.
Now, if you’re wondering how I suddenly made the connection between an educational dialogue and a national issue, let me first say that I was also watching “Escape Fire” that week. The 2012 documentary, directed by Matthew Heineman and Susan Fromke, uses evidence ranging from a doctor’s testimony to a saddening patient exposé to present a shocking revelation that our health care system is much closer to insolvency than we previously thought. It asserts that a significant portion of our health care expenses come from preventable chronic disease. While it does call for radical changes in U.S. health care policy, the documentary also focuses on alternative, proactive ways to approach the treatment of disease and improve health. This film wasn’t the first to call for fundamental change in American health care; however, the methods and evidence it utilizes make a convincing point. Ultimately, changing the trajectory of U.S. health care lies in the hands of those who built it: the people.
It was President Theodore Roosevelt who pushed for socialized medicine in his Progressive platform for the Bull Moose Party. It was President Lyndon Baines Johnson who passed Medicare and Medicaid, giving millions of Americans access to affordable health care. Recently, it was President Barack Obama who wrote into law the Patient Protection and Affordable Care Act (PPACA), which included a Patient’s Bill of Rights and gave health care access to over 30 million Americans who were previously unable to obtain insurance.
Reform at the policy level is happening. However, rampant health care problems such as rising costs remain unsolved. For example, it is estimated that obesity alone contributed 12 percent of the growth in health care spending between 1987 and 2001. Rising chronic illness and a stagnant culture of health awareness contribute to a system that doesn’t care for health but rather manages disease.
What’s worse, poverty proves to be one of this problem’s main ingredients. Unhealthy foods are so disproportionately subsidized that it often makes more economic sense to forsake a healthy diet in order to put food on the table. Additionally, health care for millions of people remains inaccessible and is still considered a privilege, not a human right. Today’s system ultimately forces many to decide against health in the name of fiscal responsibility and then refuses to provide a safety net for those choices. It is a cycle that works to prevent the lower class from rising and ultimately raises costs for all.
My point is this: Americans need to take charge of their health. It is not a question of more or less red tape from the government, but rather a question of changing our culture to one that encourages fundamental lifestyle changes and puts the focus back on improving our health, not managing disease. It is necessary for Americans to realize that health is holistic and can be fundamentally improved over time. Studies have shown that going from no daily exercise to 30 minutes of moderate exercise daily contributes significantly to weight loss and boosts overall happiness. Scans can detect cancers in their early stages so they can be neutralized before subsequent metastasis. In “Escape Fire,” many patients noted considerable improvements in pain management and decreases in pain medication consumption after switching to acupuncture. Taking charge of health means prioritizing physical well-being, staying mentally aware of the implications of good health and addressing health problems early, before they worsen.
Many of the above preventive measures remain expensive for millions of Americans, and much still has to be done to change health care policy; however, changing our health care system ultimately rests on the idea that the people’s health belongs to them. Creating a culture with this fact in mind will catalyze reform at the top, which will work to improve the overall well-being of the people. By taking that first step in the home rather than in the emergency room, our population ends up much healthier and happier.
Additionally, emphasizing preventive health will end up cutting costs, making health care for all much more affordable. If the government were to invest $10 per person in programs encouraging preventive health, it would end up saving $16 billion over five years, money that could be pumped back into the system to deliver care through Medicare and Medicaid.
This would also significantly improve access to private insurance, as health insurance companies would lower premiums due to the reduced risk of a healthier population. Lower prices will mean increased access, and the resulting competition between companies will serve as a catalyst to drive prices even lower. There is no better proof than this trend that our health care system is built for the people, by the people.
Wenzhong serves as an example of what we shouldn’t be in our approach to health. While he is a nice kid, he unfortunately doesn’t realize the long-term benefits of paying attention to his health. I urge the American people not to follow in Wenzhong’s footsteps.
- By Somnath Das
Courtesy of Tony Fischer Photography
Revolutionary-era political theorist Thomas Paine’s pamphlet “Common Sense” defended the American independence movement as a just cause. Though a short piece, “Common Sense” covered a lot of ground. The first section alone is rich with some brilliant revolutionary ideas. What is society? What is government? Why are the two so radically different? Paine had answers that still resonate today.
“Some writers have so confounded society with government, as to leave little or no distinction between them,” Paine wrote. “Society is produced by our wants, and government by wickedness. The former promotes our happiness positively by uniting our affections, the latter negatively by restraining our vices.”
Following his detailed explanation of the distinction between society and government, Paine carefully reminded his readers why governments had to exist. “Were the impulses of conscience clear, uniform and irresistibly obeyed, man would need no other lawgiver; but that not being the case, he finds it necessary to surrender up a part of his property to furnish means for the protection of the rest; and this he is induced to do by the same prudence which in every other case advises him out of two evils to choose the least.”
Since government is a necessary evil, it is important that we not place it in the hands of a sadist like HBO’s Joffrey Baratheon. Nor should we allow our government to become too bureaucratic. As a society seeking individual and collective happiness, we are best off choosing a government in which we are most equitably represented. That’s what the American Revolution was all about, wasn’t it?
Then why is it that today so many Americans complain about the government? I don’t mean individual complaints, but complaints that are voiced by large groups of people like women, minorities and the unemployed. The reason is that some nameless, blameless, rich and powerful individuals have exploited loopholes to subvert the purpose of government as our Founding Fathers saw it, essentially transforming ours into a mechanism through which their vices are promoted.
The kind of representation that our predecessors fought for does not exist, not today and, quite frankly, not ever in U.S. history. Of course, we can all agree that completely equal representation will never be attained because nothing is ever perfect. But that is no excuse for the abominably unrepresentative nature of elections today. Anybody who votes, be it for a congressman or the president, does so on the pretense that said candidate can better represent their interests than any of the other candidates. What voters do not take into account is that they, as a collective body, have indirectly yet undeniably been influenced by their favored candidate’s campaign donors, whose interests take precedence over theirs because of their deep pockets. In short, bribery in the form of campaign donations has taken over the U.S. election process.
Paine warned against this reality, in which those who are elected form to themselves “an interest separate from the electors.” He offered a straightforward imperative that would prevent such a reality.
Then he went on to make a compelling case against the British Constitution. He expressed his sentiments toward monarchies in general, noting how arbitrary it is to base political legitimacy and succession on heredity, and ended the first section of “Common Sense” by pointing out the contradiction in the reciprocal checking of power between the commons and the king that the British Constitution allowed.
The concept of checking one’s power implies that this person is prone to making poor decisions. When checking becomes reciprocal, the implication changes. Now, all sides are wrong, yet each is somehow still qualified to challenge the errors of another. But wait, have we not adopted such a concept into our own Constitution?
The way Paine would see it, our current system of checks and balances is merely a continual circular argument between three incompetent branches of government. I do not think that Paine was at all against the idea of checking power as long as it was done by true representatives of the people, but that is not the case.
Thus it appears that we as a society are insane. We blur the line between ourselves and the government, which essentially prevents us from establishing a government that suitably restrains our vices. Sure, we as a society are probably much better off with our government than other societies are with theirs. But that does not excuse us from improving our government. Common sense is a virtue sought after among most clear-minded people. With the midterm election coming this fall, the time is ripe for us to finally put the virtue into practice and make our government a true representation of society.
- By Erik Alexander
Courtesy of JL
National Collegiate Athletic Association (NCAA) president Mark Emmert dismissed the notion of “pay-for-play” this past December, stating, “There’s certainly no interest in turning college sports into the professional or semi-professional.” With all due respect to Emmert, he could not be more wrong. The question is no longer whether student-athletes should be paid – it’s through what avenues. The NCAA has attempted to paint it as a black-and-white situation, with players on one side demanding professional salaries and the NCAA on the other trying to uphold traditional amateur athletics – it knows anything short of that caricature would be its downfall. This is not the case, and with the rash of lawsuits, public scrutiny into the unsavory operations of big-time athletics and potential intervention from Capitol Hill, it behooves the NCAA to find common ground before it faces extinction.
The traditional arguments for “pay-for-play” revolve around the obscene amounts of money that collegiate programs rake in and demand that the people who generate such revenue be compensated fairly. Before I proceed, it is important to note some of these arguments are only applicable to athletes in men’s college basketball and football.
The Northwestern ruling, which affirmed the right of the university’s football players to unionize, sets the framework for future conflicts at other private universities. Regional Director Peter Ohr hammered home the point that student-athletes have obligations to, and expectations from, the university that equal and even eclipse those of standard employees. The ruling treats football players as standard employees, which raises questions about the other rights of employees that are inhibited by the NCAA and its member institutions – namely the rights to work other jobs, profit off of their own image and likeness, determine their own housing arrangements and receive workers’ compensation benefits for injuries. Because the ruling anticipated an appeal and there is little contrary legal precedent, legal experts such as Lester Munson consider it highly unlikely to be overturned – meaning that this is the landscape college athletics will soon face. While the ruling currently applies only to private universities, it will force the hand of public universities soon enough, as I will detail later in this article.
Now, why does this matter? The argument made by Emmert (and what some consider a strong rebuttal) is that it is called amateur athletics for a reason – these players should feel fortunate to not be saddled with student loan debt and consider a degree from a university more than adequate payment for their services. In light of the Northwestern ruling, some have even called for the players to be taxed on their scholarships since it would be considered compensation. These arguments are completely oblivious to the data behind the situation.
Let’s start with the revenues issue. Northwestern’s program, which is far from elite, generated $235 million over a period of 10 years starting in 2003. The six automatic-qualifying conferences will rake in approximately $16.1 billion in television revenue alone by the year 2032. The NCAA itself reported $627 million in net assets as of last year. According to a 2011 report, the average Football Bowl Subdivision (FBS) football player is worth $100,000 per year, and the average basketball player at that level is worth approximately $265,000 per year, while the average athletic scholarship doled out in 2010 was worth $10,400. Research also shows that when a football team rises from mediocre to great, applications to the university rise by 18.7 percent. There is absolutely nothing amateur about those numbers – the NCAA can’t claim that it is protecting amateur athletics when its financial decisions have done nothing to reflect that philosophy.
Clearly these student-athletes in men’s football and basketball are not adequately compensated with their scholarships – but are they even reaping the benefits of a college education? The Northwestern ruling shined a light on the strict regimen that coaches impose on players. You can hardly expect players to take advantage of a scholarship when they spend 40-50 hours a week practicing and are beholden to their coaches when they’re not on the field. Also, a dirty secret colleges don’t want publicized is that these scholarships are awarded on a per-year basis and do not have to be guaranteed for the full four years – the coach reserves the right to cut the player and eliminate the scholarship for any reason, including injury. One of the most prominent athletics programs, the Alabama football team, had over 20 players leave by choice or force between 2010 and 2011 and jettisoned at least 12 athletes for “medical reasons.” A coach can also leave for another opportunity whenever he wishes, while athletes have to sit out a season or two if they wish to transfer, jeopardizing any professional aspirations and taking away from the earning years of their lives. To summarize, the only compensation the majority of these players receive – an athletic scholarship – is rarely utilized to its full capability, is liable to be taken away at any moment and is held to a stricter standard than the contracts their coaches sign.
I’ve debunked the myth that these student-athletes are anything more than glorified employees, and also the myth that they’re adequately compensated. The last obstacle, of course, is explaining why the NCAA must modify its system to remunerate its athletes. The reason is very simple: the NCAA must do so, or it will become defunct in the near future. If we are to hold that the Northwestern decision will alter the landscape for private universities, public universities will be forced to follow suit despite state labor laws if they hope to attract any athletes who will maintain their lucrative television deals. This will put them in direct conflict with current NCAA rules, which will lead to either the NCAA altering those rules or its six major conferences leaving to form their own organization. In this situation, it is better for the NCAA to be proactive rather than reactive, lest it be blindsided by a mass exodus of programs. The revenue will continue to come in as long as the sports are played, regardless of what organization the teams are affiliated with.
This is where the characterization of “pay-for-play” comes in: this is not an argument that players should be offered a professional salary for their services. That would not be economically feasible, and many of the programs would go under. Rather, the players should be afforded the rights of any other employee – besides the compensation of a scholarship, they should be entitled to workers’ health benefits, guaranteed four-year scholarships, a schedule that allows them to take reasonable advantage of academic resources and the ability to profit off of their own image. There is a latent hypocrisy in the fact that Johnny Manziel was investigated for signing footballs for payment when one simply had to type “Johnny Manziel” into the NCAA online store to get an image of a Texas A&M jersey with his number 2 on it. Ultimately, these are changes that the NCAA needs to make to ensure the best for the student-athletes it represents and for its own existence moving forward.
- By Calvin Li
Following the capture of Constantinople in 1453, the Ottoman Empire ruled a vast multi-ethnic expanse from its capital in Istanbul, stretching across the Eastern Mediterranean, Middle East and Southern Europe. Following its defeat in World War I, the Allied Powers dismembered the Ottoman Empire. The Anatolian heartland of the empire subsequently fell to Turkish nationalists in 1923. Led by Mustafa Kemal Ataturk, the nationalists established the Republic of Turkey, which ceased to be the major regional power in the Middle East that the Ottoman Empire was, instead focusing on domestic development and Soviet containment with its NATO allies. In recent years, the Justice and Development Party (AKP), led by Prime Minister Recep Tayyip Erdogan, has reasserted Turkey’s role as a major regional power in the Middle East, independent of its NATO allies.
While the Republic of Turkey for the most part consists of historically ethnic Turkish lands, since the state’s creation there has been a substantial ethnic Kurdish minority living in the southeastern region. Due to what the Kurds believed to be violations of their natural rights as a people at the hands of the Turkish government, the Kurdistan Workers’ Party (PKK) launched an insurgency offensive against the Turkish state in 1984.
In the ensuing conflict, which persists to this day, more than 40,000 people have died, including more than 4,000 civilians. Turkey, the United States and NATO have deemed the PKK a terrorist organization. This conflict has long been central to Turkish policy, but in recent years, the conflict has died down, and the AKP-led government has been able to focus on other policy issues.
The AKP has devised a foreign policy tactic called “zero problems with neighbors.” Under this policy, Turkey aims to create warm relations with all of its neighbors, including Israel. Turkish leadership hoped that the zero problems doctrine would provide a way for Turkey to reassert itself as a regional great power in its own right, not merely the easternmost flank of NATO.
The zero problems policy’s first major test was during the Arab Spring uprising of 2011. Turkey was called on to respond to the uprisings because it is one of the Middle East’s most populous countries, the region’s largest economy and, at the time, considered a model of democratic Islamist government. The zero problems doctrine failed to be effective in the face of a multitude of regional problems.
In order to gain legitimacy amongst its Arab neighbors as a leading Muslim power, and not just a secularist Western pawn, Turkey took a dramatic stand against Israel in 2010 over an Israeli commando raid on a Turkish humanitarian aid flotilla headed towards the Gaza Strip. In response, Turkey ceased diplomatic relations with Israel. The Arab world lauded Turkey for its response. By creating this “problem” with Israel, Turkey was able to credibly demonstrate that it wanted to work constructively with its Arab neighbors as an actor independent of NATO.
The zero problems policy breaks with Turkey’s former foreign policy doctrine, which roughly coincided with that of the United States and NATO. For example, Turkey has helped Iran enrich its uranium, a measure that the United States strongly opposes.
In Libya, Turkey helped unseat Gadhafi. In Egypt, Turkey supported the Islamist Muslim Brotherhood government of Muhammad Morsi that replaced the Mubarak dictatorship. When Morsi’s government was ousted in a coup in June 2013, Turkey heavily criticized the military regime that replaced it. Since then, Egyptian-Turkish relations have frayed dramatically, with the two countries ceasing diplomatic relations. Also in response to Turkey’s condemnation of the Egyptian coup, Turkey’s relations with the various Arab states of the Persian Gulf have grown much colder.
Turkish foreign policy has gone from zero problems to lots of problems. Turkey has become particularly embroiled in the ongoing Syrian Civil War. While at the onset of the zero problems policy Turkey approached the pariah Assad regime, it quickly denounced the government with the onset of the civil war. Turkey harbors rebels in its territory, letting them use the country as a staging ground for the war against the Assad government.
Tremendous controversy has emerged in the past few weeks regarding a recording of a government meeting that was leaked to YouTube where top political leadership, including the “top spy chief,” the foreign minister and a top military official, discussed secret plans to invade Syria. The New York Times reports, “One option that is said to have been discussed was orchestrating an attack on the Tomb of Suleyman Shah, the grandfather of the founder of the Ottoman Empire, which is in northern Syria and is considered by the government here to be Turkish territory.”
This reemergence of Turkey as a dominant regional power in the Middle East has tremendous historical precedent. Turkey lies in a tremendously important geopolitical location, on the Anatolian peninsula, lying between the Balkans in Europe, the Black Sea, the Mediterranean Sea, the Caucasus and the Middle East.
In addition to its strategic location, Turkey also controls the headwaters of the Tigris and Euphrates rivers, which gives it control over tremendous amounts of water, vitally important in the arid Middle East. While much of the geopolitical discussion of the Middle East revolves around oil, sufficient water is equally, if not more, important. Because of recent dam projects, Turkey has the ability to export water all over the region, even as far as the West Bank.
While geopolitics dictates that Turkey will be a major regional player, domestic institutions shape the way that Turkish power manifests itself.
During the 1980s and 1990s, the Kemalist, Western-oriented military led Turkey. It promoted Westernization and strove to join the European Union. But the EU did not reciprocate Turkey’s enthusiasm to join the organization. This rejection soured Turkey’s perception of the West, turning the country back toward its old sphere of influence, the Middle East.
Concurrently, Turkey has become more democratic in the past 15 years. This democratization has also contributed to a foreign policy oriented more towards the Middle East. Turkey is a largely Muslim country, and its recent foreign policy has reflected this demography.
While the secular military leaders oriented Turkey more towards the West prior to the recent increased democratization, now that leadership is accountable to the public, its foreign policy has turned to the Muslim world. In 2010, when Turkey ceased relations with Israel over the flotilla incident, the public lauded the move.
In recent months, Prime Minister Erdogan has engaged in frequent anti-democratic measures that have called into question the extent of Turkey’s recent democratization. After a round of recent elections, in which Erdogan’s AKP did exceedingly well, the Prime Minister swore “to make political enemies pay a price.” While the elections were free and fair, if he follows through on these words, it does not bode well for Turkish democracy. In the past year, the Turkish government has also brutally cracked down on peaceful protesters, imposed various constraints on free speech and the free press, and seen high-level officials face corruption allegations.
The effects of Turkey’s recent autocratic policies remain unclear, as does whether these frightening policies will persist. If they do, their impact on Turkish foreign policy is equally uncertain.
Regardless, Turkey’s recent rise as a regional great power in the Middle East will have dramatic effects on the unstable region’s future.
For decades, the Middle East has been subject to the influence of great powers not native to the region, such as most recently the United States. Hopefully the rise of Turkey, a mostly-democratic Muslim country, will provide a model example for this region that has so long been troubled by radical Islam, war and political unrest.
- By Ben Perlmutter