
A new dawn for cloud
Google is targeting online gambling operators with its public cloud service, promising the enhanced user experience offered by the tech behemoth’s most renowned products. The firm's cloud platform business manager Nico Gaviola explains


Cloud services are a hotly debated topic within the online gambling realm. Younger, more nimble, start-up-driven companies are much more inclined to recognise the benefits of operating through a third-party cloud system, particularly one that offers heightened security, lower costs and the promise of less internal manpower. However, there are still sceptics who are quick to highlight the inevitable uncertainties of entrusting a third party with your entire technical set-up.
This is undoubtedly why many operators choose to host their own private cloud services instead. In the wider tech industry, cloud computing companies are booming, with some reporting healthy revenue growth in recent half-year earnings. However, the ongoing regulatory complexities of data sharing and the seismic shift in technology infrastructures are still blinking red lights on the dashboard of the online gambling industry.
Google Cloud’s platform business manager Nico Gaviola believes that in spite of this, gambling operators are increasingly turning to the technology behemoth’s public cloud set-up to host their tech stacks. Gaviola pinpoints a handful of prominent cloud features that have piqued the industry’s interest, including scalability, machine learning algorithms, big data and improved user experience.
Like most endeavours Google embarks on, its cloud infrastructure is a huge operation with five new data centres added in the last 12 months, and a $29bn investment in further development of the infrastructure over the next three years. “Public cloud is the next evolution,” Gaviola tells EGR Technology.
“Instead of servicing your own computing infrastructure, enterprises just rent them from large-scale computing companies like Google, Amazon or Microsoft. The difference is that in our case what we’re providing in terms of the data centre is completely virtualised, so instead of having machines attributed to specific customers we are able to separate customers in a software defined way and then customers just pay for what they use.”
Transitioning to the cloud is a lengthy process requiring the utmost attention to detail. The extent of migration is often the primary worry for operators, particularly when considering the public cloud. To remedy this, Gaviola recommends visualising exactly how you expect the project to unfold.
“Something that Google uses as a bit of a mantra is ‘start with the user’,” he professes. “Once that’s defined and you start investigating cloud services, you should make sure you’re working with a provider that understands your goals and objectives and can provide recommendations and showcase the technical depth for which services are the best to use.”
Gaviola advises operators to consider a provider that offers a consultative approach and an in-depth technical team. “The key initial considerations for companies are: is the infrastructure here? Is it private? Is it scalable?” Gaviola divulges.
Beyond that, operators face the challenge of making design decisions about how their applications run on traditional infrastructure and how to adapt them to a cloud architecture. With this come cost considerations, an issue that can lead operators to underestimate how much the cloud infrastructure will set them back. While the cloud may appear cost effective at first glance, once design considerations are properly mapped out it could prove more expensive than initially anticipated.
Migration anticipation
Gaviola presents three possible migration strategies, each dependent on an operator’s current set-up. To shift data from a legacy platform onto a cloud system without majorly disrupting the workload, public cloud companies can “lift and shift” an operator’s infrastructure by mimicking the workload in the cloud. “It’s generally not the most cost effective but it’s the fastest and least complex because you’re minimising change,” Gaviola notes.
Another possible migration option is a slight redesign of the architecture. Google offers an open-source framework called Kubernetes that packages workloads into containers for greater efficiency during migration, and also allows an infrastructure to span both a co-location environment and the cloud.
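To make the container route more concrete, the sketch below builds a minimal Kubernetes Deployment spec as the Python structure the official `kubernetes` client accepts. The service name and container image are hypothetical, and actually applying the spec to a cluster is deliberately out of scope here.

```python
def deployment_manifest(name, image, replicas=3):
    """Build a minimal Kubernetes Deployment spec as a plain dict.

    The same structure could be passed to the official `kubernetes`
    Python client or serialised to YAML for `kubectl apply`; the
    name and image used below are illustrative only.
    """
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{"name": name, "image": image}],
                },
            },
        },
    }

# Hypothetical containerised odds service, three replicas by default.
manifest = deployment_manifest("odds-service", "gcr.io/example/odds:1.0")
```

Because the spec is declarative, the same manifest can target an on-premise cluster or a managed cloud one, which is what makes the hybrid co-location/cloud arrangement workable.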
Ethically driven cosmetics company Lush has revealed it is developing a virtual shopping assistant and a mobile app through the Google Cloud platform.
Dubbed Lush Lens, the app will enable customers to use their smartphone cameras to identify Lush products via Google’s image recognition API. The app will offer information including ingredients and the product’s purpose.
Lush Concierge will act as a virtual assistant for both customers and staff to enquire about the nearest Lush location, whether certain items are in stock, and how many units of an item have been sold. Speaking to Cloud Pro, the firm said it had migrated its entire ecommerce site to the Google Cloud Platform in 22 days.
The final option is a complete redesign of the application using cloud native services. In such a situation, Google offers an App Engine service designed to run applications. Additionally, the firm’s long-standing investment in intelligent infrastructure, particularly in powering services such as Maps, YouTube, Search and Android, affords its cloud clients the same scale of performance as other Google applications. But Gaviola claims what really sets the firm’s cloud offering apart is its dedication to meet each client’s needs. “Google operates global services at scale while providing great user experience. There’s a lot of expertise that we can provide when they look to move their applications to the cloud.”
The shroud of uncertainty that once attached itself to the cloud is beginning to dissipate, and the sun is finally starting to shine through. Over the last 18 months, cloud offerings have come to operate at a higher level of security, privacy and scalability than the majority of gambling operators maintaining their own in-house IT, Gaviola claims. It appears to be a new dawn for cloud. “The reason why they are looking at cloud providers is for agility and innovation, so when you offer your own infrastructure what you tend to find is it slows down deployment.”
Operators are placing increasing importance on the application of big data in developing personalised user experiences, and interestingly, sitting on a cloud could advance the role of user experience further than building on top of a legacy platform. Gaviola says the cloud allows operators to free up time and drive personalisation efforts. “We’re seeing many operators start to use cloud to run their data platforms as the first step in their journey.”
Google also places high importance on user experience, and has developed an AI platform for operators to run and train their own machine learning algorithms to boost personalisation and overall user experience, through tracking historical user data and making a prediction based on Google’s model. “We’ve designed chips called Tensor Processing Units (TPUs) that are becoming available to enterprises to use to train their own model. Once your own model has been trained, you need to be able to apply it on a go forward basis. Essentially it’s a one year prediction on your service,” Gaviola explains.
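Google’s actual AI Platform and TPU workflow is far more involved, but the underlying idea described above — learn from historical user data, then apply the model on a go-forward basis — can be sketched with a toy frequency model in plain Python. The event fields (user, hour of day, betting market) are illustrative assumptions, not Google’s schema.

```python
from collections import Counter, defaultdict

def train(events):
    """'Train' on historical events: (user_id, hour_of_day, market) tuples.

    A toy stand-in for a real ML pipeline: it simply counts which
    market each user bets on at each hour of the day.
    """
    model = defaultdict(Counter)
    for user, hour, market in events:
        model[(user, hour)][market] += 1
    return model

def predict(model, user, hour):
    """Predict the user's most likely market at this hour, or None."""
    counts = model.get((user, hour))
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical betting history: one user, mostly football in the evening.
history = [
    ("u1", 20, "football"),
    ("u1", 20, "football"),
    ("u1", 20, "tennis"),
]
model = train(history)
```

A production personalisation model would of course use richer features and a trained estimator, but the train-once, predict-continuously shape is the same.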
Additionally, Google offers open APIs backed by its own pre-trained models, including the vision and speech APIs that power image and language detection in Google Images and Translate. “Today Google has designed over 2,000 different types of machine learning experiences that are used in production. Along the way it has pioneered the fundamental technology that has enabled these models to work.”
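As a sketch of how such a pre-trained API is consumed, the function below builds the JSON body for a Cloud Vision `images:annotate` label-detection request. Sending it (to `https://vision.googleapis.com/v1/images:annotate`, with credentials) is deliberately omitted here, so only the request construction is shown.

```python
import base64

def annotate_request(image_bytes, max_results=5):
    """Build the JSON body for a Cloud Vision label-detection call.

    POSTing this structure to the v1 images:annotate endpoint with
    valid credentials would return label annotations for the image;
    the network call itself is left out of this sketch.
    """
    return {
        "requests": [
            {
                "image": {
                    # Image content is sent base64-encoded.
                    "content": base64.b64encode(image_bytes).decode("ascii"),
                },
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_results},
                ],
            }
        ]
    }

# Example: wrap some (truncated, hypothetical) PNG bytes in a request.
body = annotate_request(b"\x89PNG")
```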
Another key word often thrown into the cloud discussion is ‘scalability’, a hot topic for gambling operators, which consistently face massive spikes in traffic and often find themselves scaling their infrastructure up and down. Gaviola says operators are confident Google has the infrastructure scale to absorb these spikes. He uses the example of Pokémon GO, which launched on Google Cloud in July 2016 and quickly shot to global fame among smartphone users across the world. “Increasingly operators are trusting public clouds with the fact that they can scale to whatever big sporting event they may have,” Gaviola says.
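The scale-to-the-event behaviour Gaviola describes can be illustrated with the proportional rule that horizontal autoscalers typically apply: scale the replica count in line with observed load versus a target. This is a simplified sketch, with the target utilisation and replica cap as assumed parameters.

```python
import math

def target_replicas(current, cpu_utilisation,
                    target_utilisation=0.6, max_replicas=50):
    """Proportional autoscaling rule (simplified).

    Scales the replica count so that average utilisation moves back
    toward the target: desired = ceil(current * observed / target),
    clamped between 1 and an assumed cap.
    """
    desired = math.ceil(current * cpu_utilisation / target_utilisation)
    return max(1, min(desired, max_replicas))

# A Saturday-kickoff traffic spike pushes 10 replicas to 90% CPU:
# the rule scales out to 15 replicas to bring utilisation back to 60%.
peak = target_replicas(10, 0.9)
# Overnight lull at 30% CPU: scale back in to 5 replicas.
quiet = target_replicas(10, 0.3)
```

The same rule handles both directions of the spike, which is why pay-for-what-you-use pricing pairs naturally with big sporting events.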
Regulatory rainclouds
However, one ominous shadow still lingers above the cloud: the impact of the General Data Protection Regulation (GDPR) and the complex nature of meeting ever-changing online gambling regulations. “There are certain parts of the infrastructure, particularly some of the settlements of the transactions that need to happen in certain legal jurisdictions and that requires some of the infrastructure to be physically located in those areas,” Gaviola adds.
“There are some challenges to be able to meet that because you need to design a hybrid architecture between what needs to sit in those regions and the parts that can exist in the cloud.” In the case of GDPR, it is up to the cloud provider to ensure it can meet the regulatory requirements for data processing. Operators will still be considered data controllers, as they remain in control of their data. In some respects cloud users can take solace in having one fewer compliance concern, as part of the pressure is absorbed by the cloud provider, which will have its own regulatory practices to uphold once GDPR comes into force.
Industry experts have also criticised public cloud systems for restricting communication across internal tech teams and acting as a half-baked quick fix for a poorly structured tech set-up, but Gaviola is quick to dismiss the suggestion, insisting the case is “quite the opposite”. He explains: “What public cloud does is it changes the operational model of DevOps teams but they’re certainly not mutually exclusive, or the enemy. What we typically see when we engage with DevOps teams is that we want to empower them more efficiently in providing services to the group.
“More specifically, coming back to this idea of the containers, we’re seeing a lot of DevOps teams that are interested in re-platforming their own architecture on containers and Kubernetes. It helps them react more efficiently to the demands of the business and the applications they power.” Cloud apparently eases the manner in which DevOps teams operate their infrastructure, particularly as they do not have to rely on an internal tech team to provide certain services. “That makes it a lot more agile for [an] enterprise to operate.”
So, what’s next for cloud software? It is difficult to keep up with an almost entirely intangible concept like the cloud, and perhaps that is the main reason behind many companies’ scepticism. But Gaviola says wider tech companies are increasingly interested in making the move, and that it is down to the cloud providers to meet customers’ requirements. “I think when we look ahead at where we see some of the next steps it is starting to make that journey a lot easier.”