Friday, 11 September 2015

Where you were isn't as important as where you want to be

Today is the 14th anniversary of 9/11, and there is a popular hashtag on Twitter to mark the occasion: #WhereWereYou. Many of the responses are from younger Twitter users who report that they were in school, but a few are sobering:

"Watching on the street just a block away. These words ring in my ears "Mike, Those are people!" as we saw objects fall..."

"I was about to get on plane to bring poll numbers to White House. It didn't include a single question on terrorism or security."

"High up in my midtown Manhattan office. Associate ran in said TV showing WTC burning. I turned, looked downtown, and saw hell."

"A paramedic colleague jumped in his car from CT and drove down to help. He didn't come back."

"Walking to my office in the South Tower when the plane hit. May not be writing this if I was on time to work that day."

"I don't care about ‪#WhereWereYou, you're still here. Sep 11 isn't about your personal tale but about those who aren't here to tell theirs."

That last one raises a question: Why do we care where we were, where others were? Because geopolitics changed that day? Because it was the defining moment for a generation? Yes, to both, and because it makes us feel involved, connected to a horror show we felt powerless to ameliorate. All of our lives were affected by 9/11, but most of us were helpless spectators to apocalyptic destruction happening in front of us on live TV. Our trauma cannot compare to that of those directly involved, but the event shaped our politics, our attitudes, our outlook, in ways that still echo today, however muted by time.

So, I'll play:

At the time, I worked for a French company. We had a videoconference, in French, that morning with the head office in Paris. Just before I left my desk for the meeting, I saw on CNN that a plane had hit one of the WTC towers. I assumed it was a small, private plane and I hoped no-one had been killed. I remember thinking that it was early enough that the office the plane crashed into might have been empty, and if the plane did not immediately burst into flames, they might even be able to get the pilot out alive.

Although my French was decent, following and participating in a business meeting in French took a lot of concentration, and I thought no more about the plane until building security came in and said we had to evacuate because we might be a target. I should add that my office was on the 20th floor of Rockefeller Center – the famous 30 Rock, with the ice rink and the huge tree every holiday season. The floors below us were occupied by NBC, who were subsequently targeted with anthrax by mail. I never received another piece of post that hadn't been opened and examined, and I worked there for another year.

Somehow we were made to understand that the plane had hit the WTC deliberately, and that other landmark and government buildings were being evacuated around the country. I remember trying to explain to our French colleagues why we had to sign off so abruptly, but it was all very confused. We walked down 20 flights of stairs and regrouped in a conference room at another of the company's offices, in a nondescript midtown building on 6th Avenue that was not evacuated as a potential terrorist target.

At this point, there was a feeling that we (meaning the U.S.) were under attack, with no sense of how many more planes had been hijacked or where they were headed. My work colleagues and I tuned into CNN on the big screen in the conference room. Someone brought in the sort of food trays that are served at meetings, which we ignored. My boss's boss had a brother who worked at the Pentagon, and she was desperately trying to reach him. She later learned that he had been in his office when the plane struck – luckily, on the opposite side of the building from the impact.

My boyfriend at the time, Hugh, was doing temp work, and his current assignment was at the WTC. No mobile phones were working – the signals were jammed by the volume of calls. I used the conference room landline to try to reach him, without success. I know it sounds strange, but I was worried, not frantic. It was so surreal, so hard to take in, that I simply could not believe he could have been harmed. My coworkers and I saw both towers fall in real time on the news, and it felt like watching a movie – it was simply too much to absorb that we were seeing thousands of people perish, live, at that moment, just down the street from us. The adrenaline was pumping, making for an altered state of hyper-awareness that felt more like riding a roller coaster or watching an action movie than being in a real-life warzone. I think it's a protective mechanism: our brains don't take in the emotional side at first, so that we can function until the immediate crisis is over.

By coincidence, my father was visiting for a yoga event. The class he was attending started at 5:45am, so he had made a habit of going back to my flat for a nap afterwards. I phoned him, woke him, and told him to turn on the TV. He didn't believe me at first. When I convinced him I was serious, he asked me what channel. Since I never watch TV, I didn't know what channel CNN was on. I was patiently explaining to him that there was a sticker on the back of the remote that listed all the channels, although it had worn almost illegible, when he said, "Never mind; it's on every channel."

I then called a friend, a college professor who I thought might not have heard. It seems silly now, but I didn't yet realize that even people in classes, people in their homes and offices with no radio or TV on, would have been informed. He knew, and he was worried about his son, who was also temping in NYC. He doubted his son was temping at the WTC that day, or anywhere near it, but a parent cannot help worrying. I told him I had no word on Hugh, and that I knew for certain he was temping in the WTC. I had seen both towers collapse. My friend had nothing to say – what can you say in that situation?

At about quarter past twelve, on another phone check, my father told me that Hugh was home. At that point, I left the office and walked home. I always walked the 50+ blocks to and from my office, but that day all public transport was shut down and the streets were crowded with people, some milling around uncertainly and some trying to get home on foot. I was training for the NYC marathon at the time, and a painful groin pull had given me a limp. I don't remember feeling it the whole way home.

It sounds horrible to say this, but I have to be honest and report that there was a bit of a holiday atmosphere all along Broadway. It was a gorgeous sunny day, clear blue sky, 70s, that felt like summer at its best. Stores were giving away food and drinks to people walking by, and everyone was being really nice to each other. As I said, the shock and the adrenaline kept the full knowledge of what was happening down the street from sinking in. People simply could not take it in and did not know what to think, but they were on their best behaviour. New York is a city of neighbourhoods, and those neighbourhoods can be close-knit, with good people who help each other.

When I got home, Hugh told his story: His train went past the WTC stop and let everyone off at the next one. This is a frequent occurrence in the NYC subway, "due to a police incident", so Hugh's only feeling was annoyance that the walk back from the next stop was going to make him late for work. When he emerged onto the street, it was to see the towers smoking, crowds screaming and running, and cops herding the emerging subway riders east, away from the WTC. He walked north, skirting as far east as the cops and crowds forced him. At one point, police were directing everyone down the steps into the City Hall subway entrance. He had a strong instinct not to get trapped underground, but he was herded down with the crowd. When he saw that the station and the platform were a solid sea of people, survival instinct drove him to fight his way back up the stairs to the street, right into a huge cloud of dust – the first tower had just collapsed. He made his way north, past people white with dust, floating pieces of paper landing on him, to Houston Street. There he saw a woman with her hand over her mouth, screaming "Oh my God, oh my God" over and over and pointing downtown. He looked back in time to see the second tower collapse. He then walked home, well over 100 blocks.

We got take-out and camped in front of the TV. I called local hospitals in the evening to see if I could go give blood – the feeling of helplessness, of wanting to do something, was overwhelming. (The Onion captured this feeling well in its now-classic 9/11 coverage.) But the hospitals were overwhelmed with offers to donate and were advising people to stay home. The grim truth was that there were no survivors needing blood. Medical staff not on duty had raced to emergency rooms to help cope with the casualties, who never arrived. Volunteers were also being turned away on site: unless you had Red Cross or other formal disaster-relief training, you weren't allowed south of Houston Street.

My office – indeed, most of the city – was shut down the next day. The building where my father's yoga event was being held was turned into a makeshift morgue, filled with charred body parts waiting to be matched with missing loved ones. My dad, Hugh, and I walked in the park and went out to a café, just to get away from the TV coverage for a few hours. But there was no relief. The smell from downtown permeated the air and served as a constant reminder. The TV was now listing names of the missing, showing photos, interviewing crying relatives. The vague numbers of dead passengers and office workers were now being replaced with individual names and faces and life stories. The NY Times ran bios of the victims, every single one, over the following weeks. Nothing seemed to matter; everything paled in comparison to the scale of the senseless loss of life. The news reported the search for the missing, the dogs sniffing at the smoking rubble, the hope fading. Handlers hid rescue workers for the dogs to find because the dogs were getting depressed at finding only body parts, no living victims.

The following day, we went to an Indian restaurant downtown and saw the memorials to the missing, those now iconic street corner collections of candles and photocopied pictures. It is gut-wrenching to recall even 14 years later.

My father's flight was cancelled, but eventually he got home. Work resumed at the office, albeit with the anthrax scare keeping us all jumpy and leery of our formerly beloved landmark building. I put off running the marathon until the following year, ostensibly due to my injury, but also because my heart was no longer in it. It seemed so selfish, a petty goal in the face of the loss so many had suffered. But I did run it the next year. Life went on. 9/11 became something we now talk and think about only on its anniversary, when the names of the victims are patiently read aloud, a process that takes many hours, to let their relatives know they are not forgotten.

My memories of the event matter only to me, and not even to me so much anymore. #WhereWereYou and other "never forget"-type memorials are meant to help us avoid repeating the horrors in our history. This is not a political essay, but I think it's safe to say that the U.S. response, and the subsequent destabilization of the Middle East, have made the world less safe, not more. It might be more useful to start a hashtag #WhereWillYouBe to open a dialogue about the world we want to live in, the safer, kinder world we had hoped to create in the initial global goodwill that blossomed after 9/11.

Wednesday, 11 February 2015

Social mobility, social welfare policies, and the myth of the American Dream

One question I get every day is: why is the U.S. the only developed country without universal health care? Related questions are why the gap between rich and poor is increasing, and why the U.S. has such a different attitude towards the poor than the rest of the world. This is but a cursory answer, and it begins with a little comparative history.

In Europe, there was an almost total lack of social mobility prior to the 20th century. The socio-economic class into which you were born completely determined your opportunities in life. This was understood by everyone, rich and poor, aristocrat and commoner.

In the U.S. (leaving aside for our purposes legal discrimination based on sex, race, etc.), the class system was formally abolished. Everyone was entitled to 40 acres & a mule and, if you worked hard enough, you could pull yourself up by your bootstraps. A boy born dirt poor in a log cabin could become president. A barefoot child laborer could rise to become a factory owner and robber-baron capitalist. This is the myth of the American Dream.

Now think about the attitudes toward poverty that these two scenarios shape. In Europe, poverty is an accident of birth, so the poor deserve help to level the playing field. Accordingly, Europe created social welfare programs designed to overcome socio-economic barriers. The problem in the 21st century is that those barriers are long gone. It has been nearly 100 years since people were forbidden to attend university because they hadn't been born into a high enough class. Even the heir to the throne in Great Britain has married a commoner. But the attitude that there is nothing you can do to better yourself persists, and that is a problem Europe needs to sort out – along with immigration, but that is another post entirely.

In the U.S., the myth that the playing field is level, that everyone has an equal chance to succeed if they work hard enough, has resulted in a blame-the-poor mentality. There is a belief that, in this land of opportunity, it is your own fault if you are poor, so why should people who have worked hard for their own success help you? This is why every other developed country has universal health care and the U.S. does not. Americans blame the poor for not being able to afford health care and other necessities.

In the post-war era, when the social welfare programs that we do have in the U.S. were launched, there was a temporary shift in attitude because of the Great Depression and WWII. Americans began looking to the federal government rather than the states to solve problems, because the problems of the Depression were too large for the states to handle individually. The frontier had closed at the end of the 19th century, the population was increasing, and urban slums filled with poor immigrants working for obscenely low wages in dangerous conditions were growing with industrialization. As the population urbanized, we went from a self-sufficient agricultural economy to a dependent urban one – instead of producing what they needed to survive, people living in crowded conditions near their employers worked for wages, which they spent to buy those necessities. They could no longer produce goods for themselves, so they became dependent upon wages.

All of these things (the closing of the frontier, immigration, population increase, urbanization, industrialization, the Depression) led Americans to take a more European attitude towards poverty: that it was not laziness on the part of the poor, that the playing field was not level with equal opportunity for all. There was also a different attitude towards women working outside the home. So-called welfare was originally meant to support widowed and abandoned wives with children, who were expected to stay home with their kids.

Now, fast forward to the 21st century. These social programs were created before most people living today were born. There are families where successive generations have been born to mothers on welfare. There is an argument that dependence upon social welfare has become a way of life rather than a temporary safety net for the truly needy. Those who oppose aid to the poor, including universal health care, believe that the playing field is now level, that those in poverty are simply too lazy to work and earn money for themselves.

The reality, of course, is that the playing field is far from level. A child born into an urban ghetto with failing schools and gangs does not have the same chance of financial success as a child born into a middle-class suburban community with good schools and Scouting.

Most of the debates between proponents and opponents of social welfare policies dance around this central ideological difference, but it is simply this: those who oppose helping the poor believe poverty is the poor's own fault, and those who favour helping the poor believe that their socio-economic circumstances are beyond their control. And ne'er the twain shall meet.

The way to end poverty is not to continue the social welfare programs of the past (most of which have been updated already) but to look at the root causes and address them with a "teach to fish" rather than "give a fish" model of assistance. But that is not going to happen as long as society is split over where it places the blame. As I always say, you can have your own opinion but you can't have your own facts. Both sides need to come to the table with the same set of facts before meaningful discussions about solutions can occur.

Thursday, 7 August 2014

Book Review: Year of No Sugar by Eve Schaub

Eve Schaub and I have a lot in common. We are both 40-something writers who have adopted Vermont as our physical and spiritual home. Seduced by the New England small-town and rural aesthetic, we know our way around a farmers' market and can wax rhapsodic on the virtues of local, seasonal, and organic everything. We are both lapsed vegetarians, a transformation in attitude occasioned by the availability in areas like ours of locally and humanely raised meat. For vacation, we'd both go back to Tuscany before we'd go anywhere else. Finally, and most relevant here, we are both aghast at the insane amounts of sugar in American food, and we have each attempted to wrest some control of the sugar in our diets back from multinational corporations and well-meaning bake sale vendors.

Ms Schaub's sugar epiphany was catalysed by a video on the evils of sugar. Whilst far from ignorant about healthy eating, she experienced a dawning horror at the amount of sugar her family, particularly her two daughters, was mindlessly consuming despite her avoidance of fast food and obviously empty calories such as soda. Avoiding sugar entirely seemed impossible, but she is not one to shy away from a challenge, and that is exactly what she set out to do – for an entire year, with her family joining her.

Predictably, there were tears from her children, but her husband, leery of radical diets from having grown up with a father who experimented with bizarre dietary extremes, was surprisingly game. Her book chronicles the setting and enforcing of the no-sugar rules for the year, which budgeted for monthly treats, a "birthday party rule" that allowed the girls to make their own choices about what to eat when they were away from home and their peers were eating sugar, and a personal exception for each family member – the one form of sugar they could not live without for a year. The difficulties of shopping, dining out, eating at someone else's house, negotiating holidays like Halloween and Xmas, are all described with unvarnished candour.

They made mistakes, especially in the first few months, not realising, for example, that balsamic vinegar contained sugar. I think my favourite mistake was when they bought their daughters some strawberries and plain yoghurt for an afternoon snack in Florence, only to discover the yoghurt was actually whipped cream. I bet the girls were in heaven. Some recipes adapted to be made with dextrose (an allowed form of sugar in her experiment) were a success, but others failed to gel, literally. As an appendix, Ms Schaub lists recipes that are sugar-free as well as recipes for their monthly sugary treats. The latter I found a bit odd, as recipes with copious amounts of sugar are, to put it mildly, not difficult to find, and their inclusion is somewhat at odds with the tenor of the book. But I understand that readers might be curious about them, considering how evocatively she describes their monthly sugar splurges, and how carefully they selected their most beloved family recipes as treats.

Another thing Ms Schaub does not sugarcoat (sorry – I was going to have to use it at some point, so best we get it out of the way) is her children's reactions. But it is clear that neither girl will have cause to look back on it as the Year from Hell. The number of exceptions, the sweet but sugar-free treats made from dextrose and fruit, and their freedom to eat sugar at school if they so chose, hardly made it a literal year without sugar. The scary thing, as Ms Schaub notes, is how much less sugar even this still-notable amount was compared to one of their typical years, let alone a typical American child's diet. Remember, these were kids whose mom baked bread and who had never set foot in a fast food restaurant, so their sugar consumption was already far below the norm. The most striking point, which the author emphasises repeatedly, is that the girls adjusted in many respects more easily than the adults because they had had less time on earth to become addicted to sugar. At five, the youngest was the quickest of the whole family to adapt, and by the end of the year everyone's palate had changed to such an extent that they willingly chose to eat less sugar even when it was allowed.

In addition to the gradual alteration of their palates, another point the book makes is how the process of avoiding sugar in 21st-century America exacts a mental toll, due to the vigilance necessary to police the sugar content of every morsel that drops into our shopping carts or passes our lips. As consumers in America, we have more nutritional privilege, more choice, than people virtually anywhere else on the planet, but that choice is revealed as an illusion when we face an entire aisle of cereals or sauces, all of which contain some form of sugar. Skipping dessert, as Ms Schaub explains trenchantly, does not cut it. Go to a cookout steeling yourself to resist the s'mores and you find that the buns, dogs, condiments, side dishes, even the chips, all contain hidden sugar – and the only drinks without real sugar contain poisonous artificial sweetener, which is worse. Any event of significance, from major holidays to ostensibly healthy occasions like a 10K fundraising run, is accompanied by vast amounts of sugar. Avoiding it requires superhuman willpower or complete social isolation. During her family's year of no sugar, they employed both of those tactics, along with the aforementioned judicious exceptions to make holidays and birthdays bearable.

I have taken a far more moderate approach to the sugar problem – there was never any possibility of me having the willpower to give up sugar entirely, let alone whilst on holiday in Florence, surrounded by gelato – but I experience this same frustration and horror at sugar, sugar everywhere. When I picked up the book, I knew exactly how the author was going to react when she started looking for the hidden sugar in everything, because I have been there, and I continue to rage impotently against the purveyors of sugar. I, too, peruse labels and cook and bake from scratch. That is simply a necessity. I wish I did not have to. It would be nice to stop for an ice cream with friends, or to enjoy a coffeehouse cookie that did not contain ten times the sugar necessary. Also, as Ms Schaub notes, making everything yourself is time-consuming. I prefer baked goods and ice cream with much less sugar than the purchased varieties contain, but it is not always practical to bake or to make my own ice cream.

But the more prepared foods you eat, the more sugar you ingest, and any food, whether it contains sugar or not, is more satisfying when it is homemade. The ultimate lesson of her book is not just about avoiding sugar for health reasons but about appreciating food. Mindless eating is invariably less healthy than cooking local, seasonal, organic, fresh ingredients from scratch. Kids, she notes at the end of the book, are inherently aware of this. They will take homemade bread over store-bought cake. Children know, she concludes, what is special. That sense is something we have lost in the world of corporate food, where sugar is used as a drug to stimulate unhealthy consumption and addiction. It is less willpower than an appreciation for real food that may save us.

Thursday, 10 October 2013

How To End Welfare As We Know It – Really!

Ok, Rethuglicans. I see your welfare disdain and raise you a plan. Rep. Steve King of Iowa said in July that the U.S. has a "cradle-to-grave" welfare system that encourages dependency. You know what? I agree. Wait, hold your shrieks of outrage and/or your applause. Let me explain: The U.S. welfare system does encourage dependency. It is difficult for someone receiving benefits – Rep. King noted that there are at least 80 different programs – to become financially independent and no longer need this assistance. There are, to put it mildly, disincentives to leaving some of the programs. But the reasons are not what Rep. King and other cons think they are. In fact, the reasons are of their own making.

Reason #1) Lack of universal, single-payer health care
One single-payer health care system for ALL Americans would reduce welfare dependency more than any other policy change. Mothers on benefits who get a job lose Medicaid for their child(ren). If they do not get insurance with their job, which is common for low-wage workers, and they cannot afford to buy coverage, also common, they have little choice but to quit their jobs if their child becomes sick – or just not get a job in the first place. Welfare case workers have plenty of stories of women working low-wage jobs with no benefits calling them up & saying, "My child is sick; what do I do now that I am working and no longer have Medicaid?" "Quit your job so your child can go to a doctor" is the response they get. This also applies if the mother, rather than the child, is ill.

The phenomenon of people needing to keep their income below a certain level in order to maintain their eligibility for government-subsidised health care increases the use of benefits substantially. The same situation applies to people on SSI disability. They may be able to do some work but they would lose their disability payments if they took a job, even a part-time one, so they don't work at all or only under the table.  Most people who are not totally disabled could do some work, just not enough to survive.  Despite some attempts at reform, most welfare programs are all-or-nothing.  You earn, you lose.

Beyond mothers and children, medical expenses are the single greatest source of financial strain on Americans. People who lose their jobs and their homes when they become sick or injured sign up for unemployment, food stamps, Medicaid, and other benefits – all because of medical bills they wouldn't receive in a rational, humane health care system. I just read about a cancer patient with no employer-provided insurance who could not afford surgery, and who then lost his job due to his declining health. Once he was unemployed, he was eligible for Medicaid and received treatment. He was also now eligible for unemployment and food stamps. So, we have a man who was gainfully employed full-time, receiving no government assistance, who wound up on three benefits programs due to health care expenses. If we had universal health care, he could have received immediate treatment without losing his job. He may yet wind up on disability – a fourth program – which he could have avoided had he gotten treatment earlier, before the cancer spread.

Universal health care would solve this, and many other problems. (Obamacare, alas, won't.) You cannot decry the use of social welfare benefits in one breath and oppose universal health care in the next; they are inextricably linked and inversely correlated.

Reason #2) Lack of a liveable minimum wage
We've all heard by now about Walmart workers being eligible for food stamps, and about many other companies scheduling workers for hours just below the threshold at which they would have to provide benefits (another problem that would be solved by universal health care!). I realise that many of the workers on benefits are not working full-time but that does not imply that full-time wages at $7.25/hour would support them; rather, full-time hours would still not pay a living wage but would trigger federal laws requiring their employers to pay benefits. So, it amounts to the same thing.
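
A quick back-of-the-envelope calculation makes the point (assuming a standard 40-hour week and 52 paid weeks a year, and using the 2013 federal poverty guidelines):

$7.25/hour × 40 hours/week × 52 weeks = $15,080/year

That is below the 2013 poverty guideline of $15,510 for a household of two, and nowhere near the $19,530 for a household of three – a single mother with two kids.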

Listen carefully: No-one who works a full-time job should need government benefits to make ends meet. By definition, a full-time job should fully support an individual. In fact, the Rethuglicans who want women barefoot and pregnant in the kitchen and out of the paid workforce should be in favour of every full-time job supporting an entire family, wife and kids included. It is absurd that someone who works full-time should have to avail themselves of social welfare benefits just to keep a roof over their head and eat. The point of working full-time is that you are making a living. No full-time job should be allowed to pay so little that a person living within commuting distance of it cannot live on the wages. It should simply be illegal. Full stop.

So, if you oppose a minimum wage, don't complain about people receiving benefits to make ends approach.

Reason #3) Lack of adequate family-planning resources
The classic image of a welfare recipient is a single woman with children. The Rethug strategy for reducing out-of-wedlock births is to eliminate sex ed., impose abstinence-only education, and hinder the availability of birth control and abortion. If you want to reduce the number of single moms on welfare, the sensible strategy would be the opposite: enhance sex ed., reduce the cost of birth control and abortion, and increase their availability. Make it mandatory for all schools to teach accurate sex education and to explain, repeatedly, to teenagers all the birth control options and the importance of using them correctly each and every time they have sex.

Remove the stigma associated with abortion. Make it a given that anyone who cannot support a child without government aid will have an abortion if they get pregnant. I don't mean make it mandatory – we are not China – I mean make common sense the cultural norm instead of religious drivel. Create a culture where having sex without using birth control conscientiously, or continuing a pregnancy when you do not have the money to support the ensuing baby, is simply unconscionable. Just not done. Unthinkable. Do that instead of making abortions harder to obtain and allowing insurers to opt out of covering birth control, and you will see far, far fewer women on welfare.

Whilst we are on the subject, payments to the disabled constitute a considerable welfare expense. Since we have the technology to diagnose foetal abnormalities in utero, why not make it a moral imperative to abort foetuses with serious issues instead of spending tax money on their lifelong care? Rethugs applaud when a pregnant woman says she won't abort, AND they balk at paying benefits, but they don't see the clear connection between the two. And I haven't even addressed the people on permanent disability benefits because they were the victims of gun violence, or the members of the military disabled in unnecessary wars. The Rethugs blame the Dems for the size of the welfare state, but the majority of it is of their own making and perpetuation.

These three reasons are just the tip of the iceberg. The point is that if you say you don't want citizens sucking on the government teat long-term, or at all, then you need to create a country – socially, economically, and legally – where citizens have both the incentive and the ability to take care of themselves. The people who screech most vociferously for the government to get out of people's lives tend to be the ones most dependent upon its services, and the ones most likely to favour the very policies that increase dependence.

Tuesday, 23 April 2013

In praise of telecommuting: The slacker work ethic

When I worked in an office, I got nothing done. I mean, literally nothing. I viewed the hours I was stuck there, rather than any work I might produce, as the price of my paycheque & benefits. Physically being in my cubicle, at my desk, from X am to X pm, was my job. I had to answer the phone if it rang, go to meetings, and deal with requests from bosses, but only at a bare minimum. It was mainly clock-watching. Before the Internet, it was harder to kill the time and look busy. But for nearly 20 years now, both work and slacking off have involved staring at a computer screen. My day revolved around deciding what I would have for lunch & afternoon snacks. I never ate breakfast before work (not even once in all my years of working in an office, no exaggeration) – eating it at my desk gave me something fun to do whilst waiting for lunch. As you would expect, on the rare occasions when the job was busy & I was actually working, the time flew by much faster than on the days when I was staring at the clock, reading Fark, or trying to sneak a book onto my lap under the desk. When I had freelance work, I would try to do it at my day job. Since I was trapped there for X number of hours, I might as well use that time to get it done rather than my precious free time.

When I segued from cubicle dronedom to freelancing full-time, my work habits changed drastically. Working from home, I was trapped at my desk only as long as it took to get the work done. I didn't have to sit there until 5:00pm if I had finished. I could do a million other projects, whether chores or hobbies. I could reach a certain point in an assignment, go out for a run, and come back to it refreshed and ready to tackle the next stage. If I wanted to shop when the supermarket was less crowded or hit the gym when it was least busy, I could go in the middle of a weekday & work in the evenings or on weekends. The key difference was that I was no longer merely putting in time: I was rewarded for productivity, not for being in a certain place for X number of hours each week. I was compensated based on the quality and quantity of my work, not on whether I was 15 minutes late or took an extra vacation day. The biggest incentive to get work done was that I was free when it was finished – not a moment sooner or later.

So, you can imagine my reaction to the recent corporate trend of bringing telecommuters back to the office. Obviously, I think it is a huge mistake. If you want people to be productive, you don't put them in a cubicle; you let them work sitting next to a swimming pool. You can bet they will get that assignment finished so their butt can be in that pool as soon as humanly possible. Put that same person in a cubicle and tell them they cannot leave until 5:30pm, and I will show you a person reading Amazon reviews and web comics all day long. I've noticed that since I began telecommuting I barely have time to skim the headlines of the papers. I really have to think an article is worth my treasured time before I read the whole thing. When I worked in an office, I read the paper cover-to-cover every day. Remember when games like Tetris came with a panic button that switched the screen to a spreadsheet if someone walked by? And when corporations started blocking websites to prevent their employees from checking their personal email and doing online shopping on "company time"? You won't find a virtual worker slacking off, because they are paid for results, not time.

To be fair, one of the reasons put forth for ending telecommuting is that face-to-face contact is useful for brainstorming and innovation. I will grant you that the idea-generation and problem-solving stages of a project can benefit from group brainstorming sessions. But that is what meetings are for – whether in person or over Skype. I have no objection to the theory that innovation needs collaboration, but that necessity does not translate into a blanket ban on telecommuting. Making people show up for facetime, in person or virtually, is reasonable. But the high-handed command at Yahoo! to come back into the office full-time or quit was extreme, and it will end up being counterproductive. There is also a certain irony in a technology company banning working over the Internet. It does not help the company's image that the CEO is filthy rich. A lower-paid employee could never afford the childcare arrangements she has made that enable her to put in long hours at the office despite being a new mother. At best, she suffers from a lack of empathy. I expect she has no clue how the little people beneath her live, and cares even less.

Telecommuting is cheaper for employers, and it is a necessity in a society that does not provide paid parental leave. It has become increasingly viable due to new technology, and it will continue to become practical in more industries. It will never be an option in hands-on service industries (a virtual firefighter or hairdresser – uh, no), but the people who enter those professions know what they are getting into. Trying to roll back time and turn people back into clock-watching cubicle zombies will lower productivity, not raise it. I can vouch for that.