Only Speaking Professionally | Play for Free

I like the idea of content creators getting money for their creations. I like the idea of players being able to experience that content for free. Unfortunately, for both of these things to happen, the money has to come from somewhere. Alternative one: creators starve because they don’t charge for their games, which is ridiculous. Alternative two: players pay a fee to play games – people buy the game, and, in an oversaturated mobile market with very little content control, they sometimes lose money on uninformed purchases.

Neither of the above alternatives is ideal.

There are three more alternatives, and these all delve into the realm of what is broadly known as “free to play”. Advertisers pay creators to place adverts in the game, effectively selling players’ private data to people who can use that data for targeted marketing, or to sell them products directly. Creators release demos of the product, allowing players a set period with the game before asking them to purchase. Or, creators release their entire game for free with the option of purchasing additional content piecemeal.

Of all of these options, the most morally fraught is the last one – microtransactions.

Microtransactions have a whole lot of potential to let players experience content for relatively little money, while also allowing creators to receive income. The issue, naturally, is in how these payments are implemented in-game.

The most consumer-friendly implementations are undoubtedly those that leave core gameplay untouched by microtransactions. Games like Team Fortress 2, which let you play the entire game, completely unobstructed, entirely for free, embrace microtransactions for purely cosmetic items. Buying a hat for small change makes zero gameplay difference, apart from looking pretty/silly. Sure, a bunch of people won’t buy anything, but some will, and with a userbase that can instantly expand because the game is entirely free, someone’s making the cash it costs to provide the constant updates.

Other games play mostly normally, but provide microtransactions as a form of expedited unlock system. Tribes Ascend, for example, allows for normal play, gaining experience to unlock different weapons, items, and classes. While experience gain is perhaps a little slower than it might have been in a non-free to play game, it’s still entirely possible to obtain everything given enough time and play. Purchasing gold will instantly enable unlocks, cutting down the grind. Again, the developers get money from the people who want to purchase their desired loadouts instantly, and provide a free experience to a potential install base.

But the worst type of microtransaction is the one required for game progression: a game that gates off gameplay through timers of some sort. These can be literal timers, counting down until you are able to take actions. Or, they can be disguised as commodities, which are earned over time. Perhaps the most infamous example is Smurfs’ Village’s Smurfberries. Smurfberries generate over time and are used to purchase things in game. However, you can buy Smurfberries with real cash, instantly opening up gameplay. Without spending money, the base game limits the actual options available to players, meaning people either have to wait extended amounts of time before playing, or spend real money on in-game items. It is, in essence, pay to play.

Needless to say, gouging the player with constant microtransactions to facilitate “normal” gameplay is not a great way to handle the funding model.

And now, we have Dungeon Keeper.

A “remake” of the classic PC game, Dungeon Keeper falls into the last of these three models. Buying Gems with real money instantly enables gameplay that would otherwise require days of waiting. Naturally, critics and fans slammed this move, labelling it as greedy and detrimental to the core game.

But it asks an interesting question – how do you do microtransactions right?

For multiplayer games, the answer is easy – either provide cosmetic items or unlock boosts. Microtransactions seem especially difficult to implement in single player games, though, and doubly so on mobile platforms. After all, a fancy hat is useless if you have nobody else to show it off to. And, with the expected low price of a mobile release, charging a nominal fee instead of a microtransaction is often perceived as detrimental, automatically eliminating a wide userbase used to free games. So we inevitably get the timer control model – in-game commodities that can be earned given time, gating off gameplay to arbitrary extents – some worse than others.

So how do we fix it for single player games? If I knew, I’d probably be a millionaire. Since I am obviously not a millionaire, I’ll leave it to someone smarter than me to answer that question.

The bottom line is, free to play is not inherently bad. There are some fantastic free to play games that do not exploit hapless players with excessive or forced microtransactions. There are also terribly exploitative and cynical implementations, like Dungeon Keeper’s, that should be avoided. The free to play model is a potentially lucrative and ethical way of feeding developers and providing gamers with entertainment. The real trick is walking that fine line.

Former Editor in Chief of OnlySP. A guy who writes things about stuff, apparently. Recovering linguist, blue pencil surgeon, and professional bishie sparkler. In between finding the latest news, reviewing PC games, and generally being a grumpy bossyboots, he likes to watch way too much Judge Judy. He perhaps has too much spare time on his hands. Based in Sydney, Australia. Follow him on twitter @lawksland.

The History of E3: Looking Back on Gaming’s Biggest Event

Every year, developers from around the world gather in June to showcase their most secret and anticipated projects. In the months leading up to E3, gamers witness the spectacle of influencers and industry veterans discussing the rumors of what might be, further fueling hopes that their desired announcements will come to life. In the spirit of fun and excitement, E3 allows the passion of gaming to be broadcast on a world stage and recognized for its influence on the entertainment industry.

Now that the industry is approaching the eve of E3, OnlySP is counting down the days remaining in a segment we like to call ‘12 Days of E3’. Please join OnlySP in celebrating an event that can be described as Christmas for Gamers, as we come together in anticipation for E3 2019!


The Electronic Entertainment Expo (E3) has a relatively short history but has quickly become the gaming industry’s biggest event, with inextricable ties to video game culture. The show has become the main stage for platform holders and AAA developers to show off their newest projects, causing mass hype across the gaming community.

E3 debuted in 1995 and helped put the video game industry onto the world stage.

Through all of the hyper-charged excitement and cringe-inducing stumbles, the expo remains gaming’s most anticipated annual event. Over the past 24 years, E3 has evolved and adapted to fit the video game industry and culture surrounding it.

1995–2002:

Video games have been around for decades, but even in 1991 many people still did not take the industry seriously. In fact, Tom Kalinske, CEO of Sega of America from 1990 to 1996, has detailed just how casual the attitude towards gaming was.

“Back in the early 1990s we always used to show at [Consumer Electronics Show] in Las Vegas. We were there alongside the guys that were showing their new automotive speakers, or their new computing systems, or TVs, or telephones… In 1991 they put us in a tent, and you had to walk past all the […] vendors to find us, to find Nintendo and ourselves and the third party licensees.

He continued:

“I was just furious with the way that CES treated the video games industry, and I felt we were a more important industry than they were giving us credit for. So I started planning to get the hell out of CES.”

In the 1990s, video games were largely considered to be just toys, due in part to Nintendo’s marketing strategy at the time. Nintendo targeted a younger demographic, which forced competitors to seek out alternative audiences.

Games such as Myst and Mortal Kombat appealed to an older audience, with the latter also benefiting from a movie of the same name in 1995. These titles were perhaps too successful in capturing the attention of adults, as many people took issue with the blood and violence.

Therefore, fearing government oversight, game publishers created the Interactive Digital Software Association (IDSA), which later became the Entertainment Software Association (ESA), a united bloc representing the games industry. The IDSA proposed the ESRB, which standardised age ratings for games based on the existing ratings for movies. The United States Congress accepted the ESRB, allowing the industry to continue regulating itself.

New advances in technology helped the home PC market to grow, and the widespread adoption of 3D graphics gave gaming new opportunities to branch out. In 1993, Sony began developing the PlayStation: a giant leap forward for gaming, as Sony was widely regarded as a reliable electronics company.

The first E3 in 1995 was an immense success, registering over 40,000 attendees. At this time, the console wars were in full swing and everyone was scrambling for a piece of the pie.

Games from 1995 onwards drove the industry to focus on presentation. Titles like Tomb Raider and Resident Evil pushed technology to new boundaries, exploring new genres, cinematic storytelling, and visual spectacle.

The industry had moved into the beginning of modern gaming.

2002–2009:

By the time the sixth generation of consoles had rolled out, gaming was huge, but the platform holders had narrowed to the big three—Sony, Microsoft, and Nintendo. From 2002, these three companies dominated E3 for the next eight years.

2005 marked the first time E3 received television coverage, courtesy of the G4 network, and attendance soared to 70,000. The hype surrounding the expo would only grow, as 2005 saw the announcement of the PlayStation 3 and Xbox 360.

However, the E3 expo quickly became a victim of its own success. The ESA scaled down the conference because exhibitors felt it had become too difficult to reach their target audience due to the overwhelming growth of gaming media.

As a result, from 2007 to 2008, E3 was rebranded as the ‘E3 Media & Business Summit’, and attendance was limited to 10,000 people. In a bit of irony, this move damaged E3, as media coverage became severely limited.

E3 had become a largely corporate event, and whilst the Media & Business Summit was far more manageable, it nearly became the engine of its own downfall. The ESA realised that bloggers, journalists, and personalities drove the expo’s momentum and hype.

2009–2018:

In 2009, the ESA rebranded the event back to the more familiar and catchy E3, opening its doors once again to a wider audience and drawing 41,000 attendees. The following year saw big game publishers presenting alongside the big three platform holders for the first time, with Ubisoft, Konami, and EA showing their own products and helping to further expand E3’s reach.

In 2017, E3 opened its doors to the public for the first time, allowing fans to attend talks, meet company representatives, and play samples of upcoming games. Furthermore, with streaming services and online platforms growing in popularity, E3 was able to stream live content to millions of people around the globe who could not attend, and the expo’s reach continued to grow.

Once again, E3’s success meant that some companies struggled to reach their desired audiences, especially with so much competition. Nintendo was the first big name to step back from the traditional E3 press conference, instead showcasing its games through Nintendo Direct presentations. Electronic Arts (EA) then followed by launching its own presentations with EA Play.

E3 has also been pivotal in supporting indie games by increasing coverage; last year saw big-name publishers such as Microsoft and Bethesda feature indies as part of their mainstage lineup.

To 2019 and Beyond:

E3 has firmly forged its position as gaming’s most prestigious event, and this year’s show begins on 11 June. It is sure to bring in huge crowds once again, with exciting AAA and indie games announced for 2019–2020 releases and fans and media desperate for more information on these anticipated new titles.

However, E3 can sometimes have an unfortunate effect on game developers by ramping up hype around the games it showcases. At E3 2018, Anthem was one of the most highly anticipated games, but it ultimately failed to live up to those high expectations when it released in February this year.

Perhaps games such as Anthem would have gone down like a lead balloon anyway, but maybe E3 raised expectations far beyond what Anthem was capable of.

Furthermore, Sony shocked the games industry by announcing that it would not attend E3 2019, then unconventionally revealed details of the PlayStation 5 in an interview with Wired.

These recent moves raise the question: is E3 even necessary anymore?

In its current form, E3 seems to be paying homage to the past—a time before self-made developers and 24/7 gaming coverage on streaming platforms like Twitch and YouTube. Whether E3 will adapt to the ever-changing gaming landscape, however, remains to be seen.

To see more from our 12 Days of E3, be sure to follow OnlySP on Facebook, Twitter, and YouTube. Also, be sure to join the discussion in the community Discord server.
