
misreportedandmisremembered

Bringing context and perspective to the chaos

Make Taxes Sexy Again

Only two things in life are constant: death and taxes. And while the concept of death and mortality has made many an author and poet a rich man (relative to the average poet, of course), very rarely are great works composed about the intrigue and humanity of the modern tax system. Even when the two are combined, never has prose been eloquently penned regarding the injustice of the estate tax imposed upon the transfer of property rights whenever a landholder so thoughtlessly expires.

Some may cite the lack of interest in standardized methods of taxation as evidence that they are inherently boring systems – not so. The very creation of a tax, in which a charge is imposed upon a consumer to acquire a capital pool used to fund other projects, is a fascinating concept: What should be taxed? How high should any given tax be? What should the money, once acquired, be used for? How can a system seemingly designed to be corrupted circumvent the inevitable issue of sticky-fingered bureaucratic interests? The answer is, as with all things, both enormously subjective and fiendishly complicated.

The basics: a tax is a levy imposed to gather money to fund public expenditures, and is often defined by the fact that failing to pay it is punishable by law. Taxes are issued by a governing body, typically though not always a legitimate government. They can be either direct (paid directly to the governing body) or indirect (gathered through an intermediary, like paying a 15% levy on groceries purchased at the shops). They can be charged as a flat percentage of the cost of goods purchased, or can vary based upon income levels, as sketched below. Because the money gathered in taxation is typically used to fund projects that contribute value to the community as a whole (maintaining sewage infrastructure, building new roads, laying power lines, etc.), almost every governing body imposes taxes of one form or another on a segment of its population.
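To make the flat-versus-income-varying distinction concrete, here is a minimal sketch; the rates, brackets and function names (flat_tax, progressive_tax) are invented for illustration and drawn from no real tax code.

```python
# Illustrative only: hypothetical rates, not any real jurisdiction's tax code.

def flat_tax(amount: float, rate: float = 0.15) -> float:
    """A flat tax: the same rate applies regardless of the amount."""
    return amount * rate

def progressive_tax(income: float) -> float:
    """An income-varying tax: each bracket's rate applies only to the
    slice of income falling within that bracket."""
    brackets = [(10_000, 0.00), (40_000, 0.20), (float("inf"), 0.40)]
    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        if income <= lower:
            break
        tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

print(flat_tax(100.0))            # 15.0 – the 15% grocery levy above
print(progressive_tax(50_000.0))  # 10000.0 = 0 + 30,000*0.20 + 10,000*0.40
```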

Taxation gets more complicated with scale: with multiple layers of government each setting their own priorities, explicit systems of taxation are needed to ensure each level of government is able to fund its efforts without too forcefully pillaging the earnings of good folk and giving everyone a headache. Even just focusing on the taxation of goods and services (ignoring property, income and payroll taxes, as well as tariffs), complexity abounds. A value-added tax (VAT) applies the rate of a sales tax (a set X%) to every activity that creates value: if a baker imports fruit, bakes it into pies and resells those pies, the VAT owed is everything earned in charging X% on sales (output tax) minus everything already paid in X% charges (input tax), since the rate was imposed at every step of the process, from import tariffs to the purchase of other ingredients to the sale of the pie itself.
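As a minimal sketch of that arithmetic – assuming a 10% rate and invented figures for the baker's purchases and sales – the VAT remitted is simply output tax minus input tax:

```python
# Hypothetical figures for the baker example; a 10% VAT rate is assumed.
VAT_RATE = 0.10

# Input tax: VAT the baker has already paid on purchases.
purchases = {"imported fruit": 200.00, "other ingredients": 50.00}
input_tax = sum(cost * VAT_RATE for cost in purchases.values())

# Output tax: VAT the baker charged on pie sales.
pie_sales = 600.00
output_tax = pie_sales * VAT_RATE

# The baker remits only the difference, i.e. tax on the value added.
vat_owed = output_tax - input_tax
print(f"output tax {output_tax:.2f}")  # 60.00
print(f"input tax  {input_tax:.2f}")   # 25.00
print(f"VAT owed   {vat_owed:.2f}")    # 35.00
```

The same X% is charged at every link in the chain, but each link is credited for tax already paid upstream, so each is effectively taxed only on the value it adds.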

Nuances exist: some may prefer a sales tax, which simply imposes a tax on the final sale of retail goods, often excluding core essentials such as food and energy. A sales tax is often designed hand-in-hand with an income tax system, as it can be used to piece together government revenues and provide tax breaks where they may otherwise be required. Taxes on imported goods also vary, as they may consist of excises (an indirect tax tied to the value of specific goods, often luxury items) or tariffs (a simple charge for the movement of goods across a political border).

While these differences may seem trivial, it is in these shades and margins of perpetual grey that economies live and die. Free trade zones are created to circumvent the need for tariffs, while customs unions seek to impose the same tariffs and quotas on all of their members. For an image of sheer complexity, look to India, which ranked 157th on the global ease of doing business scale as recently as last year. Central, state and municipal governments not only imposed their own dizzying arrays of charges, but also imposed tariffs on crossing state lines that led to production hold-ups that would have sent Taiichi Ohno crawling home on his knees in tears. Until, that is, the Modi administration successfully passed a reform for a single standardized GST (a form of VAT) that taxed firms only on the value directly added to a product, as opposed to the entire value of the good. The simple act of streamlining the tax code is estimated to boost national growth by 1-2%, and will serve to centralize fiscal policy within India, no small feat when looking to the land of bold colours and bolder castes.

So while there are few odes to the beauty and bounty of the modern tax system, fret not: it is not the fault of taxes that we all must pay them any more than it is the fault of death that we must die. Instead, aim to gaze in awe and wonderment at the complexity of modern tax systems and their capacity to fuel or hinder economic growth. And if you focus just hard enough, you may even forget that the interest they ask for has little to nothing to do with whether or not the topic interests you at all.


Sophie’s Choice

War illuminates the best and worst of humanity. But even in the most difficult of times, our natural desire for structure and a rules-based order is evident. The Geneva Conventions define the rules of the battlefield; the final treaty, negotiated in 1949, outlines the basic rights of prisoners and establishes protections for the wounded and sick, and for civilians trapped within a war zone. A testament to the universality of humanity is that these treaties have been ratified, either fully or conditionally, by 196 countries. Yet these norms are only effective if enforced. And rarely are the norms of warfare enforced when combatants see the guerrilla tactic of child recruitment as a viable option for victory.

Despite the recruitment of children under 15 being prohibited under both the Geneva Conventions and the UN Convention on the Rights of the Child, and protections being offered in myriad legal and international resolutions, there are at least 250,000 child soldiers currently engaged in conflicts across almost 20 different nations. Because children are vulnerable to intimidation, violence and psychological manipulation, their training consists of physical violence and ideological indoctrination. Infractions are punished and youth are forcibly desensitized to violence. There are countless horror stories of children as young as eight or nine being forced to commit atrocities through fear of death or beatings by commandants, who are themselves often child soldiers whose brutality has seen them rise through the ranks of their militias.

Most international law treats children in war zones as innocent civilians, with the understanding that children have no place on the front lines of a battlefield. Soldiers who encounter child soldiers are given guidelines as to how they should be interrogated, demobilized and helped in any way possible. But this ignores the deeply disturbing possibility that a soldier or peacekeeper may face the horrifying decision wherein, confronted by a child holding an assault rifle to his chest, he must choose to kill or be killed. In 2000, a group of peacekeepers was taken hostage in Angola by child militants, with one peacekeeper killed and eleven others injured in a rescue attempt. Soldiers who are forced to act often suffer crippling psychological wounds, and many have committed suicide rather than continue to be tortured by the moral implications of killing a brainwashed child. Earlier this year, Canada became the first nation to explicitly outline a protocol for encountering an active militant under the age of fifteen on the battlefield.

In this, Canada is to be lauded for doing what is right – soldiers in combat face impossible decisions and, without rules or orders to fall back on, must act on their own instincts for self-preservation lest they never come home. Creating a guideline ensures that a soldier who must act can hopefully maintain a sense of his own humanity in the face of trauma. The doctrine also goes beyond the point of confrontation, offering a holistic approach to the problem of child soldiers, and was written in partnership with the Child Soldiers Initiative, an institute founded by former UN peacekeeper and Canadian senator Gen. Roméo Dallaire. The doctrine states that military intelligence should map out the presence and patterns of child soldiers in an area to avoid conflict whenever possible, and that soldiers entering these war zones should be trained to prepare them for such encounters and psychologically assessed upon their return.

Human rights groups are aware of the sensitivity of this topic and have, for the most part, shown understanding. They view the doctrine as attempting to strike a balance between treating children as innocents, as outlined in international agreements, and recognizing the realities of combat. After all, it is not the fault of the soldier that the rifle aimed at him is being steadied by the hand of a ten-year-old rather than a twenty-five-year-old. By recognizing that the reality of warfare often sees internationally recognized norms thrown out the window in favour of whatever competitive advantage can be found, governments can go a long way towards creating rules that preserve the humanity of a soldier when every other rule and structure fails to prepare him for the horrors of war.

The Best of Intentions

To work is, for the most part, to perform a task or manufacture a good with a monetary value that can be sold for profit. For some, their skills are so valued that it becomes necessary to work with others to sell a combined service offering. As an example, a blacksmith could see his workload triple as his reputation for quality spreads, and he may need to hire additional blacksmiths to manage the increased demand. The original blacksmith, new in his role as manager, could seek to protect himself and his workers from potential litigation from customers. He may also be searching for opportunities to pay less of his earnings in taxes to his feudal lords (or local municipal government). The standard option for someone wishing to develop a structure that separates individual workers from their offerings is to incorporate a separate legal entity – that way, a consumer can purchase the services of the firm instead of the individual worker, and workers can work within a set structure for greater stability and predictability.

But there’s a catch. The newly incorporated Blacksmith Inc. has come under fire from local environmentalists, who claim that the garbage and refuse the company creates is degrading the quality of a nearby river from which all the townsfolk of the neighbouring village get their water. They demand that the Blacksmith act to fix this, as they deem him responsible for the medical bills incurred by drinking contaminated water. The Blacksmith is outraged – in his eyes, his company exists to make enough money so he can feed his family and his employees. It is not his responsibility to pay the expenses of people he has never met. But he also understands that his actions may have inadvertently led to harm, and wishes to avoid that happening in the future. How can he, a simple business owner, balance his freedoms as an individual and his ability to feed his family with the larger collective good?

Capitalism, the backbone of the market system that allows tradesmen such as blacksmiths to earn a living, has long viewed such questions of common resources as frivolous externalities – companies may generate pollution, contribute to societal inequality and have economic impacts that see people lose their jobs, but their obligation is not to the world at large – it is to themselves and their margins. This traditional abdication of responsibility has become thoroughly un-trendy as of late. Consumers and governments have begun to place pressure on businesses to evolve their social and environmental practices, and to take an interest in wider issues that have no direct impact on profit margins but may impact the people who work within the organisation. The hope is to promote a more responsible model of what it means to be a corporate citizen.

The key phrase here is “corporate citizen”: a corporation is, by definition, a separately incorporated personality not associated with a single person, and therefore is not a person. While corporations can, in certain countries, be legally treated as individuals (a corporation may enter contracts, own property, sue and be sued), the entire point is that they are independently managed through a centralised structure. Moreover, the economist Milton Friedman’s famous quip that a corporation’s responsibility is to make as much money for its shareholders as possible is entirely legally legitimate. If the actions of a corporation serve to reduce profitability or negatively impact public perception, they directly impact the company’s ability to keep the lights on and pay its employees. After all, a closing factory is rarely lauded for all of the carbon emissions it will now not be contributing to the atmospheric total. Instead, it is penalised for failing to achieve its core mandate: make profits and create jobs.

Critics of modern CSR have long stated that a more efficient system is to pass profits on to employees and shareholders and allow them to take their own actions as they see fit. Laws have been passed in dozens of nations to take this one step further, changing tax codes to allow corporations to deduct charitable donations from income tax, thereby incentivizing direct giving. But to claim the impact of an individual can have the same significance as that of a corporation is naive. 71% of global carbon emissions come from just 100 companies – it is silly to believe that everyone turning their lights off will have the same effect as generating action within those corporate entities themselves. Additionally, more and more companies are responding to public pressure and consumer demands for action, seeing a business case in shifting their operations to be less resource-intensive or more active in supporting members of their community. Ultimately, the business case for these actions fits squarely within their core mandate – to increase profitability by reducing expenses, and to generate revenue and sales through increased brand awareness.

But the issue surrounding corporate responsibility has never been one of what is correct, but of what is right. One of Friedman’s lesser-known claims was that “In a free economy, there is one social responsibility of business – to use resources and engage in activities designed to increase profits so long as it stays within the rules of the game.” It is not the responsibility of the corporation to act benevolently as a player, but the responsibility of the environment in which it operates to ensure the rules of the game help players do so. This can be accomplished by aligning long-term objectives and encouraging firms to behave in a responsible manner for the long-term success of all parties. It is done through the framework of law – legislation and regulation incentivizing behaviour we deem positive and penalising behaviour we deem detrimental – but also through the components of the corporation itself: any corporate entity is staffed by people, whether they be blacksmiths or otherwise. If businesses are confused as to what exactly their broadened mandate may entail, they need only look to the people within them to set priorities. It is not entities who push change as an agenda; it is people who create it through a mandate.

In our story, Blacksmith Inc. opted to implement strategies to recycle waste and reduce its impact, which meant a slightly higher cost incurred in sorting waste. But the townspeople were so grateful for his idealism that all of the neighbouring towns now came to him for work as well, meaning his business was positively impacted by his actions. In the real world, things are rarely so simple – but individuals who look at more than profit and are driven to make change maintain the capacity to drive it forward.

We all have an impact. And we are also the only ones who can help make it a meaningful one.

Man Up

One of the more curious aspects of modern Internet culture has been the spread of the masculinism movement, wherein groups come together to preach the gospel of men’s rights and speak out against movements focused on addressing inequity within social groups. This manifests most visibly in the “anti-feminist” movement, where members of forums or comment sections speak out in favour of holding both genders to the same legal and cultural standards, oftentimes falling back on unconsciously or overtly sexist rhetoric to reinforce key messaging.

However, when examined in greater depth, an interesting pattern emerges: the majority of individuals who use these discussion forums will overtly claim to be in favour of equality between genders. Exceptions obviously exist, but the discussion condemning feminism often contains a variant of the phrase “Of course men and women should have equal rights”. So if the conversation is not one about why one gender should dominate another, where then does the supposed inequality of feminism arise?

Gender as a societal construct is well documented, particularly in recent years with the growth of the transgender movement. But, as discussed in a previous post, societal constructs are developed as direct indicators of status within society. And much like being white has always been an indicator of status, being male has equally been an indicator of superior social standing. Men have always held traditionally dominant roles, a practice explored in greater depth in third-wave feminist theory, which focuses less on the disparity between sexes or genders and more on the manner in which power is assigned across cultures. And if gender is a construct, then being a man is separate from being born a member of the male sex, and can be deconstructed along social and biological lines accordingly.

In simple terms, to be a man is to be masculine, and masculinity in the Western world is defined by three core constructs – physical dominance, wealth and sexual conquest. There are indicators that these are cultural, not biological, factors, given that they have varied across history. For example, hegemonic masculinity has been at odds with male homosexuality since the Italian Renaissance, yet homosexuality was readily practised before then and was a cultural norm in earlier European societies. And, due to the role men have played in society, these three core traits became associated not only with being male but also with being powerful – indicators of power and influence became directly tied to the accumulation of wealth and the dominance of others. Myths and legends across the globe are filled with tales of heroes striking down their foes and bedding beautiful women, all while demonising the non-intimidating intellectual as spiteful, cowardly or simply not a real man.

Being male in today’s Western world is further nuanced by the fact that the status is made up of a series of privileges, not a designated set of explicit benefits. Privilege is often invisible to those who hold it, given that it manifests as a lack of barriers encountered in day-to-day life. But, oddly enough, the lack of systemic bias against males in Western culture can lead to difficulties as well. Removal of an individual’s status as a man, due to a lack of sexual conquests or a perceived physical weakness, can result in exile from his peer group without acceptance by other surrounding groups. Because systemic injustice is not ingrained into the cultural DNA of Western males, there is no reflex of mutual support in times of exile – to fail is to fail with every advantage, and is read as a reflection of the individual’s lack of capacity, not of the ingrained bias of the system. This can mean that men who lose their status find they have nowhere to turn, and can become increasingly isolated from social circles before attempting to assert their supposed masculinity in potentially harmful ways (in the last three decades, 97% of school shooters have been male, and 79% have been white, a startling indicator of the isolation felt by young males).

This systemic notion of manhood can also be seen in more mainstream terms in today’s world: men are meant to be breadwinners and the heads of their respective households. But as times evolve, and other minority groups begin to see their own status rise and their definitions evolve in societal terms, men in the Western world have not seen the same progress. An example: the modern woman can have it all. She can be career-oriented and a mother and can do anything a man can do. While barriers and stigma remain, it is no longer societally unacceptable for a woman to be dominant in her personal and professional relationships. However, it remains unacceptable amongst men for a man to be dominated – such a thing sees his status as a man revoked, and the individual is feminized by his peer group (meaning he loses the status and power associated with being a man).

This fear of being shamed or exiled can create strong animosity towards grassroots movements of minorities attempting to raise their own social status, thereby seeming to encroach on the relative dominance of men. As the definition of what it means to belong to a minority group in the West has evolved (Black civil rights movements, first- and second-wave feminism, same-sex marriage equality movements), the definition of what it means to be a man has not. Men are still expected, for the most part, to be economic breadwinners and to earn the bulk of their household incomes. Men are also expected to defer child-rearing to their spouses. These norms are shifting, but they nevertheless remain the current norms by which society interacts. And while the perception of women who choose to pursue a career instead of remaining at home to care for children is changing positively, the view of a man who sacrifices his career for his children is not evolving at the same pace, resulting in a perceived inequality when the question is framed as who loses more status by undertaking a given action.

Studies have shown that with men, the simplest way to change embedded behaviours is to instil a sense of shame. The isolation felt by individuals when their status of manhood is removed is palpable, and has been known to inspire action in more positive directions. This strategy has been used in education systems to encourage young men to stop bullying and be more accepting – the hope is that the lessons learned at a young age are not lost in adolescence, as social acceptance grows in importance within the minds of young people, and that these embedded behaviours are not passed on to the next generation of grown men.

There is no debate about whether it is easier to be a man or a woman in the Western world – across the board, men have distinct advantages and still occupy a higher position of status than women do, shown every day through the episodes of sexism in personal and professional environments that men need never encounter. But when manhood is deconstructed as a symbol of status, it is easier to understand why men in chat forums would feel animosity towards the opposing gender while still seemingly espousing its legal rights – the appearance of accepting empowerment is perhaps the only privilege men have not managed to take for themselves.

The Ivory Coin Pt. 2

This is a sequel to an earlier post titled “The Ivory Coin”, focused on dissecting the dual perspectives of white and non-white residents of the United States. That post sought to outline certain elements of mainstream views, but discussed race through a lens that presented opinion as fact. This post hopes to move beyond that subjectivity and take a more constructivist approach.

Countries in the Western world have been brought together through a shared set of values and interests, which have been codified through organizations, institutions and norms. While many of these are regarded amongst democratic countries as bastions and protectors of enlightened views (e.g. NATO, the OECD and the World Bank), each has an embedded duality. NATO may serve to protect American interests in Europe, but some Eastern groups view it as an existential threat. The World Bank may provide a funding mechanism for indebted or developing countries, but its reformist and austerity terms are often prescriptive and can create an unwanted dependence. As a duality exists in the West’s institutions, so too can it be found in its norms. Western cultures have traditionally been pro-democracy, on the leading edge of technological innovation and great creators of the arts. They have also been hell-bent on assimilation, religiously monotheistic and, since Europe’s spread across the world stage, predominantly white.

The role of race in history has been as a construct wielded to divide, often with the aim of oppression. From the caricaturization of groups with certain skin colours to the enslavement and oppression of those same peoples, race (the classification of human beings based on physical traits, geography, familial relations or ancestry) has served as a useful construct for those seeking to divide. This division gave rise to a norm amongst Western societies, even as they grew more diverse in the modern era: being white was a symbol of status, a form of non-purchasable wealth and power. Although whiteness once overtly offered benefits simply by creating an avenue in which one was not persecuted for one’s racial identity, the introduction and subsequent success of civil rights and racial equality movements throughout the 20th century sought to eliminate the idea that one race was explicitly superior to another by introducing laws to treat all as equals.

The introduction of these laws across the Western world was a symbol of hope and progress for the freedom of all peoples. But upon their passage, some misconstrued their impact as having closed the issue. An example can be found in the Civil Rights Movement in the United States in the 1950s and 60s. Although the 14th and 15th amendments of the American constitution guaranteed basic civil rights to African Americans, a grassroots struggle emerged to ensure federal and legal recognition of those rights. The civil rights movement served to break the pattern at the community level that continued to separate groups based upon race. Racism was certainly involved in the pushback that emerged against the movement, but that pushback was also a result of what was perceived as a shift in status. While feeling threatened by a fear of loss of status is not in itself racist, race was a primary factor in dictating how status was assigned at an individual level. When one group appears to be raising its cultural status to match that of another, the latter will naturally feel threatened.

That sense of lost relative status can also be found today, in the opposition towards grassroots movements organized by groups that have traditionally been perceived as minorities. When polled by the Economist in 2017, conservative voters from a sampling in the American beltway failed to cite race as a major concern, instead voicing concerns about issues relating to a loss of status amongst their communities. The clan instinct of sympathising more with those in your community (geographic, racial or otherwise) has manifested in a focus upon issues plaguing primarily white communities, like the opioid crisis and a lack of manufacturing jobs. While these issues plague other communities as well, a majority of those spoken to cited a feeling that life had become “unfair”, and viewed the election as an opportunity to tip the national focus further towards their issues. While this can be dismissed as simple politics, a deeper trend can also be seen in the desire to preserve the status of white individuals relative to other groups, often by denigrating attempts by emerging grassroots groups to address institutional bias.

Race has long been a construct used to assign societal value, heavily skewed in favour of the historically powerful. But race as a reflection of status shows a more individualistic underbelly, one where a loss of relative status (or privilege) is equated with a loss of absolute position. This is a dangerous and false characterization. It must be understood that the benefit of some is not always to the detriment of others – race relations cannot be summed neatly into game theory, nor can racism be ascribed with certainty as the primary reason why tension plagues racially homogeneous communities. A lack of economic opportunity, poor job prospects and health crises are universal issues, and addressing them will go a long way towards creating a greater benefit for all parties. But when we only look at how the actions of others affect us, we risk perceiving diversity as a weakness instead of a strength. Hardly a claim that can be made proudly in an enlightened world.

The Cost of Sharing

A popular idiom in Western culture is that possession is nine-tenths of the law. While this may hold true in civil or domestic suits about who gets to keep the kettle in divorce proceedings, there is one notable exception: land. In every country across the globe, throughout all of history, issues of land rights have been discussed, questioned, disputed and protested, and they have typically ended in either a settlement or a war. Few topics are as universally inadvisable to bring up in a community discussion as who owns disputed territory, a testament to our similarities across geographical and cultural boundaries. If only we had more in common than wanting to kill each other over bits of grass and dirt.

The fate of humanity has been tied to the land since we first established settlements. Land is used for agriculture and housing, and the resources grown from it or buried within it are extracted and used to build and develop – without the ability to nurture the land, we would quite literally not have evolved past roving groups of nomads. As our societies grew more complex, so did our numbers and our demands – land that was once freely available began to hold greater value due to resource richness, proximity to other regions, or size. Societies all offered a similar recourse for remediating disputes over which land was owned by whom: contested land could be placed under the guardianship or ownership of a particular individual, group or community, with the idea being that the stewards of that land would then tend to it and they alone would be able to exploit whatever resources lay within their set geographical boundary.

Earlier cultures were less focused on drawing specific or arbitrary territorial lines upon maps, but groups across the globe (religious, cultural, familial) typically found themselves operating in set territories or regions, and any interaction between groups required rough boundaries to be defined to prevent competition for resources and potential escalation into conflict. As societies evolved, and distinct cultures began to interact in greater and more violent ways, land was often taken by force, and the land-use rules of whichever culture was victorious were implemented. A typical example is the imposition of land rights and land-use agreements upon aboriginal cultures throughout the world by Western colonials, wherein treaties and agreements were typically signed without a firm shared understanding of communal versus individual ownership. Cultures in North America, Australia and Africa all tell similar stories of “placelessness”, wherein they were removed from their traditional territories and forced into environments with different cultures and customs. Some were able to stay on their original lands, often at the expense of safety or freedom. Many of these tales end in assimilation, slavery or extinction.

Today, the issue of land rights has only grown more complex. Central governance has simplified the codification of land rights into law, but the same issues that plagued original communities persist today, echoes of the flawed systems installed in eras past. Compounding the problem, there are more humans than ever before, and land is becoming a scarcer and more valuable resource. Agriculture, urban development, construction and industry all require resources and land, painting a picture in which we currently inhabit or use 90% of the world’s land. This resource limitation has led to increased conflict, often between the same structures as before: centralized authority (governments) and communities. Specifically, land rights have arisen time and time again in issues of development in countries bearing the scars of recent conflicts.

Take Vietnam and Kenya. Vietnam’s Communist government operates under a system where the government assigns land-usage rights but maintains that all land remains property of the state. Vietnam’s current economic growth rate is around 6%, and cities and urban areas are growing rapidly, eating into surrounding farmland to satisfy increasing demand. As a central government hoping to avoid conflict, Vietnam must offer recompense to farmers who have lost their land, must host consultations to discuss land-use plans with the public, must offer avenues for legal recourse when civil disputes arise, and must maintain a registry of all land so that the above can be tracked. Protesting farming communities have offered much feedback on this – specifically, that recompense does not match the value of the land, that consultations are often ignored and civil suits dismissed, and that land registries are so poorly maintained by corrupt officials that disputing parties cannot even agree upon who owns what land in the first place. All of this risks disrupting economic momentum and fostering discontent within the populace, seemingly creating the exact problem the system set out to avoid.

In Kenya, each of these same issues arises, with the added dimension of the land claims of indigenous groups who have traditionally occupied territories. More than two-thirds of Africa’s land is still rooted in communities and not written down or legally recognized, meaning governments and communities repeatedly clash when the issue arises (one is viewed as conquerors, the other as squatters). Kenya’s government has attempted to address this through a large-scale titling initiative to ensure those who claimed their land could be entitled to the legal rights of ownership. But even while a third of Kenyans can claim to legally own the land on which they reside, insufficient infrastructure and endemic corruption, as well as cultural issues surrounding the rights of women and minority groups, pose severe threats to enforceability – the registry may end up just a long list in a back closet if landowners are unable to take meaningful recourse when their rights are infringed upon.

Even in rich countries, where property rights are secure and land can be used as collateral in financing, land rights issues emerge at the community level. Municipal governments in Canada have broad powers to expropriate property for “municipal purposes”, and the land rights issues of indigenous peoples continue to plague communities faced with poverty and an endemic lack of economic opportunity. Greater steps, including the development of land-use registries accessible to all (blockchain technologies offer interesting opportunities, as sketched below), the further titling of rural properties to prevent government land grabs, and the greater codifying of the rights of communities into land-use charters, should be taken to ensure transparency and fairness for all involved. If not, the risk of impeding legitimately needed development while squabbling over patches of dirt grows ever greater. Then it won’t matter who possesses what – we’ll all be stuck squabbling in the mud.
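As a purely illustrative sketch of what a tamper-evident, publicly checkable registry might look like – the entries, names and fields below are hypothetical, and this shows only the general hash-chaining idea behind such proposals, not any specific blockchain system:

```python
# Sketch of an append-only, hash-chained land registry (illustrative only).
import hashlib
import json
from dataclasses import dataclass

@dataclass
class RegistryEntry:
    parcel_id: str   # identifier of the land parcel (hypothetical)
    holder: str      # registered owner or community steward (hypothetical)
    prev_hash: str   # hash of the previous entry, chaining the record

    def entry_hash(self) -> str:
        payload = json.dumps(self.__dict__, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

class LandRegistry:
    def __init__(self) -> None:
        self.entries: list[RegistryEntry] = []

    def record(self, parcel_id: str, holder: str) -> RegistryEntry:
        prev = self.entries[-1].entry_hash() if self.entries else "genesis"
        entry = RegistryEntry(parcel_id, holder, prev)
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        # A retroactive edit to any entry breaks the chain for all later ones.
        prev = "genesis"
        for entry in self.entries:
            if entry.prev_hash != prev:
                return False
            prev = entry.entry_hash()
        return True

registry = LandRegistry()
registry.record("parcel-001", "Nguyen family farm")
registry.record("parcel-002", "community trust")
print(registry.verify())  # True; altering an earlier entry flips this to False
```

A real system would also distribute copies of the ledger and add a consensus mechanism, but the chaining above is what makes retroactive tampering – the “long list in a back closet” problem – detectable by anyone holding a copy.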

Coming to a Consensus

The 1970s saw the end of much throughout the Western world: the Vietnam War came to an official close, numerous other proxy wars were launched across the globe, and mainstream society began to adjust to the countless cultural revolutions of the previous decade. But for all their similarities, the world’s two brightest liberal superpowers each underwent a series of minor economic crises of identity, with the decade serving as the worst-performing for industrialized countries since the Great Depression. Low growth rates and the oil shocks of 1973 and 1979 rang through the global order, reverberating to signal change in both the United States and the United Kingdom.

The United States experienced its last trade surplus in 1975, and the latter half of the decade witnessed the economy undergo a period of stagflation, in which inflation and unemployment rose steadily in tandem. Trust in government was at a low point following the Vietnam War and the Watergate scandal under President Nixon, and the solutions attempted (broadening the money supply as a stimulus measure and phasing out Nixon-imposed price controls) had mixed impacts in the short term. Great Britain experienced a parallel slowdown, as Butskellism (an economic strategy favouring public ownership and public/private-sector collaboration towards a grand “Industrial Strategy”) saw the welfare state expand to become burdensome, reducing the competitive advantage Britain once held over West Germany and Japan.

The problems of the 1970s were well analyzed and researched by regional think-tanks and experts, the majority of whom favoured different policies with a few consistent themes: greater liberalization of markets, reductions in taxes and regulation, and increased competition to foster innovation and growth. Each nation saw a champion of this cause rise, in the forms of Ronald Reagan and Margaret Thatcher, each with their own plans and complex legacies that helped give shape to the world of today.

Ronald Reagan was elected on the promise of reducing taxes, promising a return to the growth levels of the industrial era before FDR’s New Deal. His combination of removing price controls from petroleum, slashing corporate and individual taxes, simplifying the tax code, investing in military spending, increasing government borrowing and lending, and delegating greater autonomy to the states served to raise both individual and institutional spending levels, reinvigorate growth and boost employment. His economic policies, gathered under the banner of Reaganomics, have been credited with leading to the second-longest period of peacetime expansion in the history of the United States.

Margaret Thatcher was carved from a similar mold of conservatism, formed by a belief in free markets and small government. More than Reaganomics, Thatcherism encapsulated the age into which she dragged the UK. By transitioning the national economic focus from managing unemployment to managing inflation, Thatcher prioritized monetary objectives, warring with unions and industrial strategists to achieve her goals of reducing the tax burden, cutting state subsidization programs, and exerting control over monetary policy. Thatcher also embodied the toughness with which she governed, often centralizing national decision-making around herself to deal with crises, and bringing together a Britain once thought ungovernable. Under Thatcher, the inflation rate fell from almost 15% to below 5% and growth rates exploded.

But each leader’s policies of small-state governance had disenfranchising impacts as well. Reagan’s policies saw poverty rates rise, with trickle-down economics proving largely ineffective at spurring broad-based growth. Thatcher’s policies raised unemployment in the UK by over 1.5 million people in her first five years. The international aid programmes of each began to focus upon structural reform, with the IMF and World Bank growing more forceful in urging austerity programs upon struggling states – programs that are nothing if not unpopular amongst citizens. Each leader’s brand of social conservatism, military hawkishness and economic libertarianism further polarized voters, and their support of traditional marriage, penalization of criminality and disdain for the needy created figures for the mold of conservatism currently being shaken off today. But even in today’s environment, each leader would find a place: both were tough straight-talkers who utilized populist speech to mobilize supporters and justify a values-driven agenda. Reagan’s pursuit of the Soviet Union and Thatcher’s disembowelment of the trade unions were fuelled by rhetoric of “welfare chisellers” and the scapegoating of an opposition often incapable of speaking for itself.

The legacies of economic policy are best measured not in quarters but in decades. Many credit Reaganomics with facilitating the capital allocation that enabled the tech boom of the 1990s, and Thatcherism saw London return as the world’s financial hub, prompting Great Britain’s economic evolution past the age of manufacturing. Each nation saw its role shift towards consuming goods rather than producing them, enriching some and putting many out of work. The policy themes in each platform are credited with influencing similar actions taken in both Australia and New Zealand, ensuring that the world’s Anglo-Saxon colonial powers all remained allies on similar footing moving forward.

The economic objectives of each (lowering tax rates, reducing regulation, restraining government spending and noninflationary monetary measures) spurred growth rates and reshaped the economic identity of each nation for years to come. However, their legacies lie not in their success at growth, but in the side effects of these policies. Milton Friedman, an economic advisor to both and cut from the mold of the Chicago School of neoliberalism, admitted that trade deals and deregulation devastated the working poor across the globe. Increased inequality, exacerbated by falling wages and rising unemployment, is now what defines the economic legacy of these two leaders who each managed to resurrect their countries from struggle. Each was followed by conservatives who doubled down on their models, fuelling further inequality and the inflation of boom-and-bust bubbles. And each dismantled systems in ways that laid the groundwork for the modern populism currently sweeping the US and the UK. Trickle-down indeed.

Silent but Deadly

The modern environmental movement owes much of its mainstream awareness in the Western world to Rachel Carson’s Silent Spring, an environmental science book primarily focused upon ascribing the visible loss of biodiversity and environmental degradation to the use of artificial pesticides in agriculture. The book accused chemical companies of spreading misinformation concerning the use of certain compounds, and is credited as a pivotal turning point in the creation of the US Environmental Protection Agency and the eventual ban of the pesticide DDT, once sprayed liberally on crops, from agricultural use.

The movement also brought a new concept into the mainstream consciousness – that our impacts upon the environment are tangible, attributable and traceable back to the externalities of human activity. This conceptual leap facilitated the mainstream adoption of the idea that carbon emissions from industrial activity were harming the environment, even if individual people had difficulty connecting the impacts of this change to their daily lives. But describing a pollutant – a substance introduced by human activity that has a detrimental impact on the surrounding environment – often conjures images of sea turtles eating plastic bags, landfills of scrap metal and used electronics, and noxious smokestacks belching black smoke over the stockyards of 1900s Chicago.

It is this variant of pollution – physical – that poses a greater risk than many anticipate. Simply understood as the introduction of discarded materials into the environment, its scope is enormous. Studies have shown that in plastic alone, 4-12 million tonnes are dumped into the ocean every year, with concentrations currently found as high as 580,000 pieces of plastic per square kilometre of ocean. Exact figures are notoriously hard to come by when calculating dumping, as it is a practice conducted by individuals, groups, industries, organizations and governments in every single coastal nation on Earth. But the average annual wastage of 8 million tonnes (over 17.6 billion pounds) has the “triple threat” impact of interfering with the feeding patterns of animals, increasing the likelihood of entanglement and drowning for marine life, and leaching toxins into the environment as degradation occurs over thousands of years.

Let’s start with plastic. Plastic production has doubled every 11 years since mass production began in the 1950s, with corresponding increases in the concentrations found in the marine environment. Trends traced since the 1970s show over half of all seabird species in decline, and a 2015 study in the Proceedings of the National Academy of Sciences estimated that over 90% of all seabirds are now contaminated with plastic, a figure projected to reach 99% by 2050. Most startlingly for humans, the areas with the highest concentrations were not the expected regions, such as the infamous Great Pacific garbage patch currently making up as much as 8% of the entire Pacific Ocean, but rather southern coastal regions where the ecological diversity of species is highest. These findings show that areas once thought pristine are at elevated risk, and with one in six humans depending primarily upon fish as the protein source in their diet, the trophic repercussions of excess plastics could see roughly 16% of the global food supply degrade faster than expected.
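Taking the doubling claim at face value (with a 1950 start year assumed purely for illustration), the compounding is easy to unpack:

$$
P(t) = P_{1950}\cdot 2^{(t-1950)/11}
\quad\Longrightarrow\quad
\frac{P(2016)}{P_{1950}} = 2^{66/11} = 2^{6} = 64,
\qquad
2^{1/11}-1 \approx 6.5\%,
$$

that is, roughly 6.5% growth per year, and a 64-fold increase in production over the 66 years since mass production began.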

Plastic is far from the only problem: mass consumption of electronics, combined with an average replacement cycle of 3-5 years, has seen the disposal of electronics and manufactured goods skyrocket. Once disposed of, electronic goods can be recycled – the United States claims to recycle 15% of all electronic goods disposed of nationally. But with recycling programs for used electronics introduced only 70 years after the commercial popularization of the goods themselves, and the EPA estimating that over 2.17 million tonnes of electronics were disposed of without being recycled in 2015, recycling is a far cry from the norm. Additionally, a 2015 UN report found that over 90% of global electronic waste (approximately 41 million tonnes) is illegally dumped annually by developed nations into developing nations, costing $52M USD and bypassing the recycling market entirely.

The danger here is twofold: electronics do not degrade in natural environments, creating vast landfills of e-waste, and complex machinery leaches chemicals into the environment, including volatile compounds and heavy metals. Lead, cadmium and mercury have been found in e-waste dumps in quantities harmful to human health. And unlike in developed nations, where worker-safety regulations protect employees from contact with harmful substances, e-waste dumps are manned by informal recyclers, often children, who burn plastics to access the valuable copper and rare earth minerals contained in circuitry and hardware. A similar problem exists with the disposal of scrap metal and used appliances: high levels of zinc, cadmium, mercury and copper have been found in yards mined by children working gloveless and inhaling the fumes from stacks of burning garbage.

Nations have taken steps towards regulating and penalizing the dumping of physical pollution, which has served to change some behaviour. Recycling rates for electronics are increasing year over year for both consumers and industry members. Technologies for improving recycling methods advance annually, and the global recycling industry for electronics alone is now valued at $410B USD. But whether to panic can best be informed by the Canadian scientists who in 2015 called marine plastics “the new DDT”, posing the same level of systemic risk that the chemical compound once did. One can only hope that such warnings inspire the same level of action that Carson drew with her plea for a better world.
