
Interactivity is a welcome by-product of the internet revolution. With media no longer one-sided, audiences are able to take a more active role in the news presented to them. Recoiling from the outmoded top-down approach to communication, comment sections were introduced with the intention of engaging readers in a meaningful exchange of ideas. Though as history has so kindly demonstrated, giving a voice to unhinged individuals is known to end disastrously.

Upon reading an interesting article in which pediatricians endorsed gay marriage, I endeavoured to leave a well-thought-out comment detailing the social and political reasons for which I agreed with the view presented. But an onslaught of abusive comments, echoing Alcuin’s “madness of the crowds”, changed the swing of things. As with any human being in possession of a sound mind and a soul, distasteful homophobic remarks tend to strike a chord with me. I replied to a comment, and a wildly infuriating virtual battle ensued with two individuals (one with the screen name SaggingBellyFat, a sure indication of the level of intellect we are dealing with) that saw me on the receiving end of some conservative American Christian propaganda, in which I was accused of being enslaved as ‘Satan’s puppet’. Of course, adopting the only tactic appropriate when arguing with imbeciles, I made personal attacks straight back. Entertaining, yes, but constructive? Probably not.

Trolling is a side-effect of online democracy so prevalent that WordPress has published a guide on how to extinguish trolls. Yet by blocking comments, two seemingly equal values come into conflict: democracy and general human decency. The main reason I would suggest such comments remain unfiltered is archival; in the same way we chronicle publications across time, we are now presented with an even wider insight into public debate. In the wise words of Nick Couldry, we all ‘have the capacity to give an account…that is reflexive, continuous and an ongoing embodied process of reflection.’ For as sad and gruesome as some remarks can be, they are indeed a reflection of humanity today. What does need to be moderated, however, is accountability. It’s basic sociology that anonymity breeds hatred. How easy it is to assume the wildest of personas when in disguise. Publicise your heinous derogatory beliefs, by all means, but do leave your full name and contact email and accept full liability for any repercussions. Also so that in generations to come, everybody knows who they are laughing at.

The point to remember here is that as terrible as a voice may be, it is still a voice, and a part of our world today. Other people’s views are interesting, and providing a platform for all to be heard is crucial for a multifaceted, autonomous society. Cowards with pseudonymous screen names, or those masquerading as ‘anonymous’, however, should not be a part of the debate.

(SaggingBellyFat, I am looking at you)


Ten years ago, the Australian Government estimated that one in five Australians was living with an impairment. Since then, the world has witnessed rapid development in technologies that have the potential to end exclusion for our impaired community members. Yet shockingly, disability remains an afterthought, so often associated with charity and burden. Gerard Goggin of the University of Sydney asks, ‘If we are now possessed of greater knowledge about disability and design, why is accessible and inclusive technology so difficult to bring about?’

The UN Convention on the Rights of Persons with Disabilities (2006) promotes access to technology for all users, including those with impairments. A year after this convention, the first Apple iPhone was released: a completely touch-screen device with no raised buttons, voice-activated commands or alternative pointing devices, it was rendered almost completely inaccessible to those with vision impairment. It was a sad example of people with impairments being completely built out of a technology. The good news is that things are now changing, and members of the blind community are touting current versions of the iPhone as ‘amazing’ and ‘life changing’, with accessibility consciously built back into the phone, and apps that can even identify colours and read their descriptions aloud. Though, according to Goggin, Australians with impairments are still underrepresented in the take-up and use of mobile technologies, and people with disabilities continue to feel excluded. Recently, Apple’s main competitor, Samsung, attempted to have Apple’s VoiceOver function removed from all its phones, claiming it to be a patent infringement: a decision that would leave thousands of blind iPhone users unable to use their phones at all.
This illustrates clearly how attempts at accessibility seem more closely linked with profit and PR than with genuine concern to include the perspectives of people with impairments in design. Here in Australia, though, there is reason to celebrate: our leading telecommunications brand, Telstra, has long focused on technologies allowing access for all users, and the ACIF has established a Disability Advisory Body in which people with impairments serve as experts in the development of Australian communication systems. Clearly, the technologies to create such accessibility exist; it is simply a case of a shift in the way we think, towards considering those with impairments as integral to the development of software, rather than an extra audience to cater for down the road.

There has been ongoing debate as to whether Australian television shows manage to represent our diverse cultural landscape. The media play an intrinsic role in influencing public perceptions of race, so it’s clear that a multicultural society should have a multicultural media. Throughout this debate there have been numerous references to the ‘whiteness’ of shows such as Neighbours and Home & Away, and criticism of shows such as Underbelly for casting Middle Eastern actors to play the bad guys. Whilst I agree that all too often the foreign actor is cast as the criminal rather than the doctor, it is worth remembering that Underbelly was based on a true story, and in the actual story the ‘bad guys’ were of Middle Eastern descent, so perhaps it is time to retire that particular example.

It would also seem that diversity schemes are not the answer. Instating quotas for ethnic minorities in the media, and handing out roles to non-white Australians purely for a point of difference, can ultimately result in even more tokenism, prioritising political correctness over talent. Instead we need to think clearly about how different roles are distributed, moving past the ‘stereotypical, caricatured roles such as the wog criminal’ and the general ‘baddies’, and casting people of different ethnicities in those everyday roles such as doctors and police officers. Recently, Neighbours introduced an Indian-Australian family (though sadly the online response was so flooded with the racist slurs ubiquitous throughout our nation that the network had to go on a comment-deleting spree). My Kitchen Rules had a similar story.

There is still a long way to go, and we do need to find more creative ways of celebrating our vibrant mix of cultures in our mainstream media. But it is worth acknowledging the small triumphs so far. Over the past decade we have seen the award-winning Woolworths advert featuring Greek couple Maria and Stavros, shot completely in Greek and casting a playful nod to those (stereotypical, though in a loving way) Greek grandparents we all wish we had. We have borne witness to television triumphs such as Salam Cafe, which encapsulates the shift in the representation of Australian Muslims by making light of mainstream misrepresentations of Islamic culture and reducing racial sentiments of ‘the other’. Shows such as The Straits, The Slap, The Message Stick and East West 101 also portray a diverse Australia, paving the way for a multicultural media in the future. We aren’t fully there yet, but it does seem that we are making baby steps.

Walled garden: an attractive environment designed to keep the captive reasonably satisfied, and requiring some cost to escape from it.
In the computing world, this refers to centralised software systems that heavily moderate access and content. There is widespread discussion that walled gardens pose huge limitations on openness, hinder innovation in the digital world and act as a direct threat to the democratisation of the internet. Google co-founder Sergey Brin admits that he has ‘never been so worried’ about the future of the open internet, with internet freedom being endangered by a ‘combination of governments trying to control access for citizens’. Mega-brand Apple has been accused of having such a desire for control. In the world of Apple, the walled garden offers relative calm for the consumer amid a strictly regulated ecosystem of networking. Apple focuses largely on simplifying programs, providing ongoing customer support and a stable user environment largely free of viruses and malicious software. Apple’s marketing campaigns clearly attempt to defuse suggestions that a rigorous screening program for apps leaves consumers without something they need, reassuring them constantly that ‘there’s an app for that’. The much-touted negatives of walled gardens are clear: limited functionality, locked-in platforms, patenting that hinders creativity, and a widespread fear of control.

Though both sides must be weighed up. Tim Berners-Lee, the inventor of the World Wide Web and an outspoken supporter of an open internet, stated that ‘the greater ability of small companies to innovate meant it was unlikely that the current web giants would maintain their dominance indefinitely.’ Basically, once upon a time, the world was threatened by the closed system of Netscape. Then Microsoft became the new scary government prototype. Now it’s Apple. In other words, people are not so trapped in their beautiful walled gardens as to fail to recognise the innovation that lies on the other side. iOS, the most innovative system in decades, is a prototype of a walled garden. Ultimately, walled gardens have their place. They serve a clear purpose: to streamline the chaotic internet world and provide a usable service that has allowed access to the digital world for those who thought it unimaginable. Ironically, walled gardens seem to do more to create access than to hinder it.

It is argued that the rise in for-profit online courses is democratising learning, creating ‘accessible, quality education for everyone who wants it’. But there is a disturbing trend of online universities run merely for profit, which pride themselves on developing ‘courses like products’, outsourcing staff to cut costs and standardising degrees so thoroughly that they can be completed as cheaply as possible and resold each year to a fresh set of undergrads.

For universities to assume a total business-management model is to miss the point of tertiary education entirely. We live in an age of academic inflation; to put it quite simply, when it comes to getting hired, the same level of education won’t get you as far as it once would. More people are graduating from university than ever before. The same employers who were once thrilled by a Bachelor degree might now consider an Honours inadequate. So basically, if you’re only going to university for that piece of paper, you are wasting your time. Universities need to offer more than just something to write on your resume; they need to create a space which nurtures creativity and innovation, and equip students to think and communicate with the most powerful medium of our time. To standardise a degree is to hinder creativity and to ignore the individual. Standardised models of education do not allow humans to flourish; we need an education system that is personal, that allows individuals to develop their own solutions with external support. Creating courses merely to extract economic benefit from people dislocates students from their natural abilities. We need courses that adapt to the individuals who attempt them, courses that challenge students and push them.

Why focus solely on the vocational, when the qualification has practically been rendered inadequate? If all universities adopt such an ‘institutionally-focused pragmatism’, then surely students will adopt a similarly disinterested approach, and the result will be nothing more than a cohort of disengaged young adults with worthless transcripts. We’re creating a generation of students with only one bottom line in mind: money. What will this mean for the future of academia? Well, if the focus is on nothing more than pleasing the customer, there probably won’t be much of a future. Realistically, the current teaching generation will one day be dead, and universities will be run by graduates of these stock-market-listed atrocities, with little intention to contribute to public knowledge or political discourse.

And what about open access? The internet is the most powerful communication force that we have seen to date. We must utilise the internet to harness creativity, not hinder it. We need to use it to disseminate knowledge in a way that benefits society as a whole, not corporate greed. We need to use it to inspire invention and improvisation – to utilise online publishing in a way that previous generations could only dream of. Society thrives upon a diversity of talent. Diversity is not going to materialize through turning out cookie-cutter courses faster than the cheques can be cashed.

‘The dogmas of the quiet past are inadequate to the stormy present. The occasion is piled high with difficulty, and we must rise with the occasion. As our case is new, so we must think anew and act anew.’ – Abraham Lincoln


We hear that print media is dying, bleeding into the online world of citizen journalism that will eventually supersede it. Yet shockingly, amid the panic, cookbook sales are on the rise, particularly those by food bloggers. We live in an age where, for the first time ever, individuals are able to establish a writing career through self-publishing, the traditional barriers to entering the print media world seemingly gone for good. Though the startling irony is that most of these success stories inevitably end in a book deal. Back to print media. Back to the traditional; a full circle. Look at Katie Quinn Davies, the Sydney food blogger who recently had her book published in four languages. There are many, many very similar stories.

Rather than replacing traditional journalism, such participatory media can serve as a gateway into traditional editorial journalism. Food blogs might take the hard work out of searching for a recipe, but perhaps won’t seem as credible, and perhaps won’t offer the same high standards of consistency in language and style achieved through rigorous editing. There is something to be said for work that has been slowly perfected, something that is finite. People can tell whether something is written by a professional or an amateur. In a fast-moving world where online content can be edited and re-edited again, readers can take solace in something that isn’t going to change. It is what it is. Perhaps this is a passing phase; perhaps this soaring of cookbook sales is merely a by-product of this so-called age of gastronomy; or perhaps nothing will ever replace a shelf of thick, well-worn, dog-eared cookbooks that you can hold in your hands, splatter gravy upon and hand down to your daughters. For now, rather than replacing traditional media, we see citizen journalism, at least in the area of food, complementing and enriching a media environment that is now as multifaceted as it is democratic.


Across Australia’s media landscape, everybody seems to have something to say about the Finkelstein Inquiry, and unsurprisingly, not everybody agrees. On one hand, media academics praise the inquiry for attempting to improve media accountability and increase transparency; on the other, senior journalists herald the contrary, suggesting that academics only support free speech they themselves agree with. A notable example is the editor-in-chief of The Australian, Chris Mitchell, who goes so far as to describe media academics as being ‘far removed’ and intent on ‘infect[ing] people with progressive left ideology’. That’s a big call.

There is clearly a rift between the media academic community and practicing journalists, and the real question is: who is more qualified to make judgments about regulation?

The immediate response, of course, would be those with experience in the industry themselves, but then again, look how far self-regulation has got journalism in the past. Clearly we need to consider the strategic alignment journalists have with advertisers, and the political agendas of media outlets themselves, both of which seem inevitable given Australia’s concentrated media ownership. Standards of journalistic integrity are seemingly lost in a strange domain where anything goes in the pursuit of selling stories, though the good news is that people are noticing.

Solving this problem cannot be left to either side of the field alone; if the media is going to move forward in a way that allows free speech and media independence to mingle harmoniously with journalistic integrity, the two sides must unite. Journalists and media researchers both play a crucial role in policy reform (and ulterior motives will always be found on both sides of the fence). At the crux of modern journalism lies a debate applicable to both academics and journalists, and if the two aren’t working together, will Finkelstein become nothing but rhetoric? And will this looming new media body simply become just another player in an already heated game?