The Invisible Truth: January 2009

Wednesday, January 28, 2009

York's Disastrous Public Relations Policy

By all accounts, the strike at York University has been hard on everyone directly involved, and it has been difficult for their families as well. As in most crises, however, it has not been equally difficult for everyone. For three months, I have withstood blizzards, double-digit sub-zero temperatures, the stress of witnessing my colleagues physically threatened or attacked, and a steady stream of verbal abuse from a small, vocal group of undergraduates whose knowledge of basic grammar is lamentable. Meanwhile, the president and senior administrators have avoided doing their jobs properly while being paid six-figure salaries, preferring instead to bargain through the media. York's public relations budget is at least twice the total salaries of the striking workers, who do over 50% of the teaching at York.

Let's look at the implications of this fact for a minute, and the rationale behind it. York seems to think that spending a vast amount of money promoting the prestige of a York degree, while spending as little as 7% of its budget on the overworked, underpaid employees who do most of the teaching, is a good way to expand capacity. Well, let's look at the results. A disgruntled workforce has initiated the longest strike in the post-secondary sector's history, in Ontario at least. I know at least three extremely intelligent and talented graduate students who have dropped out in the last few months. Enrolments are down in most faculties. Has York's strategy succeeded? Obviously not.

Like it or not, the public perceives education as the main function of a university, as shown by the media's tendency to emphasize the strike's effects on undergraduate students. Granted, research is also an important function of our universities. But given the popular perception of the primacy of teaching, perhaps it would be a better strategy to give graduate students and contract faculty a better deal and spend less on public relations. As the rapid rise of viral marketing and other word-of-mouth and text-to-text strategies shows, the field of public relations has changed dramatically. The money York has spent on advertising has essentially gone down the toilet because of the resulting word-of-mouth discontent with York as an educational venue. Had the workers been satisfied with their contract, they would have been more likely to speak of York in glowing terms, and their social networks might have lit up in York's favour rather than against it, as the present situation has shown.

The obvious counterargument is that York bases its pay for graduate students and contract faculty on norms for the sector. Maybe York should heed the rhetorical question my mother asked me whenever I told her I was about to do something bad because all the other kids were doing it: if they all jumped off a bridge, would you? Perhaps this strike can serve as a wake-up call for other universities.

Meanwhile, with class-action suits against York pending, where is the accountability? Senior administrators and some undergraduates have asked McGuinty's government to intervene on their behalf. While I laud the simple act of becoming politically engaged, I think these undergraduates don't understand the long-term implications of such an intervention. Once they finally get their degrees and enter the workforce, the dangerous precedent of back-to-work legislation will loom over them. The long-sacred democratic right to negotiate working conditions through collective bargaining will have been forever undermined. Back-to-work legislation is by definition unconstitutional. These students are, in effect, shooting themselves in the foot in extremely slow motion. The wound will be no less painful when the bullet finally hits. And will the government hold York's administration accountable? After all, it has massively mismanaged public funds.

I truly regret the negative effects of this strike, not only on undergraduate students and my colleagues, but also on York's staff and the underpaid employees of York Lanes, the university's retail and service hub, some of whose hours have been cut as business declined during the strike. The university has also lost a great deal of parking revenue. Tenured faculty no longer feel proud of their once-mighty teaching and research institution. But you know what? I don't regret going on strike. I know in my heart that my colleagues and I have stood up for justice and equity when no one else would.

Sunday, January 25, 2009

The Geo-metrics of Agriculture

Tuesday, January 20, 2009

Friday, January 09, 2009

Digital Technology, Memory, and Social Networks

According to the Torah, God made Adam out of dirt or clay, so on the most literal level humans are of the earth. Adam's descendants have now forged their own neo-Adams out of silicon, the second most abundant element in the earth's crust, and copper. The genesis of artificial intelligence has a longer history than you might suspect. In 1642, Blaise Pascal invented the first mechanical calculator (the abacus notwithstanding). It was made of wheels and gears, quite in line with the mechanistic views of the universe circulating at the time, and heavily influenced by clockmaking. In the late 1930s and early 1940s, several large analog and digital computers were developed, notably to break wartime codes. After the room-sized computers came the personal computer in the seventies, which quickly revolutionized everything from business to hand-eye coordination as people started playing video games. The PC took a couple more baby steps that posterity will remember as a moon walk as it made itself over in the form of the laptop and the Palm Pilot. In 1973, Martin Cooper of Motorola invented the cell phone, which operates through radio waves. As many analog technologies went digital (cameras, phones, the Walkman), humans stepped into the realm of cyborgdom. The primary or auxiliary function of many of these digital portable devices (DPDs) is memory. With all these gadgets decked out with memories of their own, moonlighting as accoutrements, has human memory suffered? Or perhaps we shouldn't be so pessimistic in formulating our questions, and should instead ask how our memories have changed since the popularization of these devices in globalized culture.

Scholars believe that in the Middle Ages, because of the general lack of literacy (a privilege or a burden, depending on how you look at it, borne by monks in seclusion), cultural memory, and history for that matter, was preserved in verse. Troubadours were the wandering historians, putting the oral stories generated by different communities into rhyme and meter. Fast forward to 1876, when Alexander Graham Bell, who had conceived of the telephone in Brantford, Ontario, patented it and made the first call in Boston (a ceremonial New York-to-Chicago call followed in 1892). As this new technology spread like a bushfire during a drought, people began memorizing phone numbers en masse to maintain their social and business networks. The rapid displacement of analog technologies in favour of DPDs has introduced external memory into our personal space. For many of us, this is a relief, as we no longer feel compelled to perform the sometimes tedious task of memorization. Of course, the minute we lose or misplace our cellphone, which also serves as a digital phone book (assuming we haven't transferred its contact list to four different DPDs, a task that is itself tedious and often frustrating), we are in a serious muddle. A chain reaction of limitations is imposed on our actions from without; we feel a loss of agency, and only then do we realize the extent to which we are chained to our DPDs. During such a loss, we can experience a feeling of isolation, a sudden disconnection from our social networks.

While these devices have opened up new avenues for social networking, such as mass emails and online networking tools like Facebook, which often reconnects people who've lost touch with one another – sometimes contrary to their better intentions – these artifacts of modernity don't come without their knicker-knotting aspects. Whole new realms of techno-ethics and etiquette are arising, unbeknownst to some. Is there anything more infuriating than hearing a cellphone go off in a movie theatre, for instance? Mind you, it's easy to forget to turn them off (unless there are signs posted, in which case there is no excuse at all), but there are people out there who seem oblivious to the world around them and actually carry on loud, long conversations during a movie, much to the frustration of other movie-goers. Then there is the debate raging over the place of cellphones in schools. As a teacher, I have experienced the frustration of students furtively playing with their cellphones during class. Yet parents assert their right to reach their children at all times, which has had the unexpected effect of eroding children's tenuous sense of independence. Cellphones have also made cheating on tests and exams via text message that much more feasible. The classroom, a site for honing the memory, is not immune to the memory-proof digital commons of cut and paste. The ease with which information can be looked up with a search engine has made memory retention a somewhat quaint, even archaic, talent.

There is a catch-22 lurking in the erosion of human memory at the hands of technology: we must remember the ethics and etiquette of emergent technology, which is changing so quickly that it is hard to keep on top of appropriate usages and contexts. Forgetting, or simply being unaware of, these ethics can splinter rather than knit social groups, generating conflict. The proliferation of mp3 players and musical phones has made aural seclusion available in very public places, which can produce an exaggerated sense of personal space and an individualism that complicates sociality. While these technologies are touted as the answer to all our networking problems, they have as much potential to alienate as to connect. Since their popularization, the boundaries between business networks and social networks have been dissolving, and work has found its way into the most private nooks and crannies of our lives. We have entered a world of paradox, where our memories have been lulled into inactivity, and where we can cultivate spaces of isolation yet be held accountable to our places of business as we change our babies' diapers.

The phone number is practically the blueprint of short-term memory retention; perhaps it is no accident that its basic form is seven digits. Psychologists have long held that humans can keep about seven items of information (such as digits) in short-term memory for a span of roughly fifteen to thirty seconds. To transfer those items into long-term memory, we have to imprint them through repetition, translate them into images or acronyms, or use some other mnemonic trick. And to keep items in long-term memory, we generally have to retrieve them periodically to "refresh" them and keep them active. Now that many of us have DPDs, the need to memorize phone numbers has diminished. Phone calls are merely a matter of speed-dialing, or of summoning the contact list, finding the right name and number, and pressing dial. The decline of such an everyday skill, one that keeps our memories active and strong, has the potential to drastically change our consciousness by making forgetting a more determining factor in our lives than remembering. Not only that, but to what degree are the benefits of new technology for social networks counterbalanced by the socially divisive knowledge gaps that develop between digital haves and have-nots?