AzureCraft summer 2016 keynote review

A review of Scott Guthrie's keynote speech on Day 1 of AzureCraft, summer 2016.

Posted: 16 June 2016
Event date: 3 June 2016

AzureCraft is a free conference organised by Microsoft and the UK Azure User Group (@ukazure). This summer's event ran over two days. The first day consisted of lectures and workshops and was aimed at developers, IT professionals, data professionals and architects. The second day, hosted by Microsoft at their UK HQ in Thames Valley Park, Reading, consisted of games, workshops and a hackathon, and was aimed at tech innovators, web and app developers, makers, data developers and entrepreneurs of all ages, from 10 years upwards.

I attended the first day of the conference and this article is a review of the opening keynote speech by Scott Guthrie (@scottgu), Executive Vice President of the Cloud and Enterprise group at Microsoft. Before Scott took to the stage at the Mermaid Theatre, Richard Conway (@azurecoder) from the UK Azure User Group gave a quick introduction to the conference. He explained that the name AzureCraft is a fusion of Microsoft Azure and Minecraft, reflecting the technical and maker goals of days 1 and 2 of the conference.

Introduction

After a round of applause, Scott took to the stage. First up, he explained that Azure is Microsoft's open, flexible, enterprise-grade cloud computing platform, where you can stand up VMs, infrastructure and platform services that run all over the world. Azure has 32 regions, which is more than AWS (Amazon Web Services) and Google Cloud Platform combined. Scott revealed that two new Azure datacentres are being built in the UK, one of which will be in London. Azure datacentres are huge sites; Scott showed an aerial photo of one that is over two miles in length. Microsoft Azure, he went on to say, gives you choice and flexibility. It lets you use the tools and frameworks you are familiar with, and supports pretty much every OS and programming language in the world.

32 Azure regions around the world / Credit: @msdevuk

Azure Stack

Azure is enterprise ready. The recently announced Azure Stack enables a consistent programming experience, giving customers the flexibility to run on-premises, in the Azure cloud, or to take a hybrid approach leveraging both environments. Scott explained how Azure Stack allows companies operating in highly regulated environments to move applications and data back and forth between on-premises and the public cloud as compliance demands.

Developers can use a "write once, deploy to Azure or Azure Stack" approach. Using APIs that are identical to Microsoft Azure, they can create applications based on open source or .NET technology that can run on-premises or in the public cloud. IT professionals can transform on-premises datacentre resources into Azure IaaS / PaaS services while maintaining oversight using the same management and automation tools that Microsoft uses to operate Azure.

Azure Stack lets organisations embrace hybrid cloud computing on their own terms, helping them address business and technical considerations like regulation, data sovereignty, customisation and latency. It gives businesses the freedom to decide where applications and workloads reside without being constrained by technology.

Productivity

Scott went on to say that Azure is about productivity: it allows you to move faster, do more and have fun doing it. He then asked the audience, "At your organisation, how long does it take to stand up a new database?" Answers from the crowd ranged from days to months, but seemed to average about six weeks. Next came the first demo. To cheers from the crowd, Scott logged in to the Azure portal and created a new SQL database that was ready to use in 15 seconds. The same is true of VMs: it takes only a minute or two to stand up VMs running, for example, Windows Server or Ubuntu.

Resource Health

Azure Resource Health is a new feature that exposes the health of individual Azure resources and provides actionable guidance to troubleshoot problems. The goal of Resource Health is to reduce the time customers spend on troubleshooting, in particular the time spent determining whether a problem lies inside an application or is caused by an event inside the Azure platform. Scott explained that Resource Health information can be accessed programmatically or via new sections (aka blades) in the Azure portal. For VMs, Resource Health gives you access to a wealth of information including boot diagnostics, console logs and performance counters.

Azure Momentum

Next, Scott presented some impressive statistics that show Azure's momentum:

  • More than 120,000 new Azure customer subscriptions / month
  • 1.6 million SQL databases in Azure
  • 2 trillion messages per week processed by Azure IoT
  • 600 million Azure Active Directory users
  • More than 4 million developers registered with Visual Studio Team Services
  • More than 40% of revenue from Startups and ISVs

Scott explained that more than 85% of Fortune 500 companies use Microsoft Azure and then talked about companies that are using Azure.

Azure momentum / Credit: @msdevuk

AccuWeather

AccuWeather uses Azure's on-demand scalability to collect and crunch real-time weather data from 3 million locations across the globe. Azure enables AccuWeather to respond to 10 billion weather data requests per day.

BMW

BMW Connected is a digital mobility experience based on the Open Mobility Cloud, which runs on Azure. Journey management is at the core of the first version of BMW Connected, providing travel planning before a trip and continued services afterward. To accomplish that, BMW's platform captures data from different sources, including real-time traffic conditions, and makes it easy to add destinations from sources like a user's calendar, contacts, messages, apps and habits learned over time.

Azure's platform-as-a-service (PaaS) products such as App Service give BMW's platform resilience and scalability, while Azure's global network allows for a seamless rollout of services worldwide. Service Fabric enables the automaker to build individual mobility graphs with personalised data and real-time context. Event Hubs handles data ingestion, HDInsight manages large amounts of unstructured data and Azure Machine Learning enables intelligent, scalable systems.

Rolls-Royce

Rolls-Royce, who build half the jet engines in the world, are integrating Azure IoT Suite and Cortana Intelligence Suite into their service solutions to expand their digital capabilities and support the current and next generation of Rolls-Royce intelligent engines. Azure IoT Suite allows Rolls-Royce to analyse and visualise large quantities of engine telemetry, which can be used to improve fuel efficiency and help prevent engine failure.

Rolls-Royce use Azure / Credit: @msdevuk

TalkTalk TV

In 2015, TalkTalk, the UK's third largest cable TV provider, acquired blinkbox, a movie and TV streaming service, which was rebranded TalkTalk TV. Prior to the acquisition, blinkbox had already moved their video encoding and streaming application to Azure, running in multiple regions to gain the flexibility of Azure's elastic compute and storage. Blinkbox first migrated most of their existing application in a lift-and-shift fashion using 100 virtual machines (VMs) on Azure infrastructure-as-a-service (IaaS). Following the acquisition and continued rapid growth, parts of TalkTalk TV were rewritten to use microservices running on Azure Service Fabric. With independently deployable microservices, TalkTalk TV are now able to quickly add or modify functionality, something that wasn't easy to do with the old n-tier architecture running on multiple VMs.

Transport for London

Transport for London (TfL) have developed Trackernet, a real-time display of the London Underground tube network, which is hosted on Azure. Trackernet displays the locations of trains, destinations, signal aspects and the status of individual trains at any given time. The data from Trackernet and other TfL sources is made available through a unified API where interested parties can access a single view of all data shared by TfL.

The Visual Studio Family

Next up, Scott described the Visual Studio Family, which consists of:

  • Visual Studio - a rich, integrated development environment for creating applications for Windows, Android, and iOS, as well as modern web applications and cloud services
  • Visual Studio Code - a code-optimised editor for building and debugging modern web and cloud applications on Windows, Mac OS X or Linux
  • Visual Studio Team Services - cloud-based collaboration services for version control, agile planning, continuous delivery, and application analytics
  • Xamarin - for cross-platform mobile development

Xamarin

Xamarin, Scott said, is "loved by developers", "trusted by enterprise" and now free with Visual Studio. Xamarin allows developers to build native mobile apps on Android, iOS, and Windows with C#, while Xamarin Test Cloud allows developers to test their applications on hundreds of Android devices.

Scott welcomed Xamarin Program Manager Mike James (@MikeCodesDotNet) to the stage, who gave a quick demo of Xamarin's capabilities. Mike built a simple iOS app consisting of a map that could be scrolled and zoomed, and a label that displayed the position (longitude and latitude) of the map. He showed the app running in Xamarin's remoted iOS simulator and explained how, on modern Windows machines with touchscreens, the remoted iOS simulator window can be used to test your app's response to gestures such as pinching, swiping and multiple-finger touches. Previously this level of testing was only possible on real physical devices.
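
Mike's demo code wasn't shared, but a page along those lines can be sketched in a few dozen lines of Xamarin.Forms C#. The coordinates, class name and layout below are illustrative, not taken from the demo:

    // A minimal Xamarin.Forms sketch (not Mike's actual demo code): a map plus a
    // label that reports the latitude and longitude of the map's visible centre.
    using Xamarin.Forms;
    using Xamarin.Forms.Maps;

    public class MapDemoPage : ContentPage
    {
        public MapDemoPage()
        {
            var positionLabel = new Label { Text = "Pan or zoom the map..." };

            // Start centred on London with a one-mile radius.
            var map = new Map(MapSpan.FromCenterAndRadius(
                new Position(51.5074, -0.1278), Distance.FromMiles(1)))
            {
                VerticalOptions = LayoutOptions.FillAndExpand
            };

            // Update the label whenever the visible region changes.
            map.PropertyChanged += (sender, e) =>
            {
                if (e.PropertyName == nameof(Map.VisibleRegion) && map.VisibleRegion != null)
                {
                    var centre = map.VisibleRegion.Center;
                    positionLabel.Text = $"Lat: {centre.Latitude:F4}, Lon: {centre.Longitude:F4}";
                }
            };

            Content = new StackLayout { Children = { positionLabel, map } };
        }
    }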

App Service

After Xamarin, Scott talked about App Service. App Service is a cloud platform for building web and mobile apps that connect to data anywhere, in the cloud or on-premises. To demonstrate App Service, Scott fired up Visual Studio and used a project template to create a fully functional ASP.NET MVC web application. To prove that what came next was live, Scott asked the audience for a name he could use to replace some text on the About page of the web application. There were a few muffled suggestions from the audience and then someone shouted out (what we think was) "Billings Badger". So Scott changed the text to read, "Billings Badger - I don't know what that means but ok", joking with the audience that he hoped it wasn't rude. I don't think any of us knew what "Billings Badger" meant, but hey!

App Service / Credit: @msdevuk

With the text changed, Scott showed how quick and easy it is to deploy a website using Visual Studio and App Service. In fact, all you have to do is right-click your Web Application project in Solution Explorer, select Publish, select "Microsoft Azure App Service" as your publish target, enter your Azure subscription details and you're pretty much done. Within five or so seconds Scott's website was published to Azure and available at an azurewebsites.net URL. Sure enough, the "Billings Badger" text was visible on the About page of the website, proving the publish and deployment were not staged.

Scott explained how you can "Auto Scale" web applications based on parameters such as CPU usage. For example, if CPU usage goes above 70% you might want to add another VM to handle requests; if it drops below 20% you might want to remove a VM. If you want to be notified when thresholds are breached, you can set up alerts that send emails or fire webhooks.

Build and Scale Apps / Credit: @msdevuk

Resource Health lets you see if your web apps are running normally and, if not, provides methods for troubleshooting them. You can view HTTP traffic in real time from within the Azure Portal and Site metrics gives you access to loads of performance counters.

Finally, Scott described Deployment Slots. Deployment Slots allow you to validate changes to your web app in a staging deployment slot before swapping it with the production slot. After a swap, the slot that held the staged web app contains the previous production web app, so if the changes swapped into production are not as expected, you can perform the same swap again immediately to get back your "last known good site". You can create up to 10 deployment slots and can configure Auto Swap if you want a swap to occur automatically whenever updated code is pushed to a slot. Each deployment slot has its own URL, such as "mywebapp-staging.azurewebsites.net", and ACLs can be set up to prevent unauthorised access to non-production deployment slots.

Functions

Next up, Scott introduced Azure Functions aka "serverless compute". With Azure Functions, you can run a snippet of code without the overhead of VMs or App Service. To demonstrate Functions, Scott created a new Function App using the Azure Portal. Scott accepted most of the default options to create a "Webhook + API" Function using the C# programming language. The Function is executed via a URL and the default sample code that gets created reads the value of the "name" query string parameter and writes to the response "Hello " + name. The default sample code shows how to log information when the Function is executed, which can be viewed in real time in the Azure Portal.
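
For reference, the default C# script that the "Webhook + API" template generated at the time looked roughly like the following (a sketch from memory, not a transcription of Scott's screen; the TraceWriter logger is supplied by the Functions runtime):

    // Roughly the default "Webhook + API" C# sample: read the "name" query string
    // parameter and write "Hello " + name to the response, logging each execution.
    using System.Linq;
    using System.Net;
    using System.Net.Http;

    public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        // Look for the "name" query string parameter.
        string name = req.GetQueryNameValuePairs()
            .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
            .Value;

        return name == null
            ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string")
            : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
    }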

You can use a number of different languages to write functions, including C#, Node.js, Python, F#, PHP, Batch, Bash and Java, and you only pay per execution of your code. Functions supports NuGet and NPM, and you can protect HTTP-triggered functions with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter and Microsoft Account. What's more, the Functions runtime is open source and available on GitHub, so you can build and run it anywhere.

Many developer platforms and services, Scott explained, support webhooks allowing events on one site to invoke behaviour on another. To illustrate this, Scott showed how an Azure Function could be called in response to opening an issue on GitHub using webhooks. After the GitHub demo, Scott created a timer-triggered Function that was invoked every 5 seconds and simply logged the current date and time.
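
The timer demo needs very little code. A sketch of a function like it, using a six-field CRON schedule of "*/5 * * * * *" (every five seconds) in the function's binding configuration, looks something like this:

    // A sketch of a timer-triggered C# function. The schedule lives in the
    // function's binding configuration; the body just logs the date and time.
    using System;

    public static void Run(TimerInfo myTimer, TraceWriter log)
    {
        log.Info($"Timer trigger fired at: {DateTime.Now}");
    }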

Finally, Scott showed how easy it is to integrate other Azure services with Functions. In his final Functions demo, Scott showed how a Function can be triggered when an image (blob) is created in Azure Storage. Each time an image was created in Azure Storage, Scott's demo Function called a Vision API to identify words in the image and write them to a database. All very impressive stuff!
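
Scott's code wasn't published, but the shape of that last demo is roughly as below. The vision endpoint and key are placeholders, and where the real demo wrote the extracted words to a database, this sketch simply logs them:

    // A hypothetical sketch of the blob-triggered demo: when a new image lands in
    // Azure Storage, post it to a vision/OCR endpoint and capture the words found.
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    public static async Task Run(Stream imageBlob, string name, TraceWriter log)
    {
        log.Info($"Processing new image: {name}");

        using (var http = new HttpClient())
        using (var content = new StreamContent(imageBlob))
        {
            // Placeholder endpoint and key - not the actual service Scott used.
            http.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<vision-api-key>");
            content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

            var response = await http.PostAsync("https://<vision-endpoint>/ocr", content);
            string words = await response.Content.ReadAsStringAsync();

            // The real demo wrote the extracted words to a database; here we just log them.
            log.Info($"Words found in {name}: {words}");
        }
    }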

Visual Studio Team Services

By now, Scott was running out of time, so the remaining content was covered quickly. I did my best to scribble down some rough notes! Next up were tools for DevOps and Visual Studio Team Services. The focus of this part of the keynote was on continuous integration and deployment. Scott logged into visualstudio.com, where he was able to view the source code for a number of his different projects. Scott explained how code check-ins can be set up to trigger automated builds. The Build tab within Team Services shows the status of queued, completed and deleted builds. Builds can run tools such as Grunt, Gulp, NPM and NuGet, compile code and run unit tests. Following a successful build, applications can then be automatically deployed to Azure.

Scott explained that when a DevOps team needs control over the deployment of code into particular environments, rules can be set up governing the release process. For example, if QA have to sign off a build before it is deployed to Production, a rule can be set up to enforce this. Additional functionality can be integrated with build workflows using webhooks; for example, you might set up an Azure Function that sends an email after a successful build. Scott briefly mentioned that Team Services can also be used to perform cloud load testing of your applications.

In a nutshell, Team Services is about enforcing a repeatable build and deployment process that reduces bugs and improves quality.

Application Insights

Scott explained that one of the biggest challenges for teams with large applications is determining the root cause of problems when they occur. This can be hard when, for example, your web app is getting thousands of hits per second. How do you detect, triage and diagnose issues from the large amounts of telemetry? To address this issue, Scott introduced Application Insights. If you have a .NET application, it's quick and easy to get up and running with Application Insights. All you need to do is reference the Application Insights DLL in your .NET project and, within minutes, you'll get basic telemetry out of the box. The Application Insights API allows you to create custom telemetry, while the Azure Portal lets you build dashboards based on this telemetry.

Application Insights / Credit: @msdevuk
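
As a flavour of the custom telemetry API Scott mentioned, a few illustrative calls might look like the sketch below (the class, event and metric names are made up, not from the keynote):

    // An illustrative sketch of custom telemetry with the Application Insights API.
    using System;
    using Microsoft.ApplicationInsights;

    public class CheckoutService
    {
        private readonly TelemetryClient telemetry = new TelemetryClient();

        public void PlaceOrder(decimal basketValue)
        {
            try
            {
                // ... business logic ...
                telemetry.TrackEvent("OrderPlaced");                        // custom event
                telemetry.TrackMetric("BasketValue", (double)basketValue);  // custom metric
            }
            catch (Exception ex)
            {
                telemetry.TrackException(ex);                               // custom exception telemetry
                throw;
            }
        }
    }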

Application Maps scan telemetry to automatically discover your application topology, layering performance information on top of it. They let you identify performance bottlenecks and problematic flows across your distributed environment. Application Maps are interactive: you can click on them to drill down into specific areas of your application's topology. For example, if your application is failing due to performance degradation in the SQL tier, you can click on the SQL tier to drill down into SQL Database Advisor in order to understand the problem.

Service Fabric

With limited time left, Scott gave a quick overview of Service Fabric. Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices.

For those that have not come across microservices, Martin Fowler defines the microservice architectural style as an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API. Fowler goes on to say that these services are built around business capabilities and independently deployable by fully automated deployment machinery.

Scott showed how to "Dockerise" an application. Docker is a containerisation platform that is widely used to package and run microservices. First of all, Scott used Visual Studio to create an ASP.NET Core 1.0 web application. ASP.NET Core applications are built on the new .NET Core platform and can run on Windows, Linux or Mac. Scott pointed out that he had already installed the Visual Studio extension "Docker Tools for Visual Studio 2015". With this extension, you can right-click your web application and select Add Docker Support. This adds all of the files required to "Dockerise" your application and changes the target host environment from IIS Express to Docker. In order to run Docker containers locally, Scott had set up Debian Linux running on Hyper-V. Scott hit F5 and Visual Studio connected to the local Docker container, deployed the code and ran the ASP.NET Core 1.0 web application. Scott then showed how you can set breakpoints and debug applications that are running in a Debian Linux Docker container.

To deploy the "Dockerised" app to Azure, Scott went to the Azure Portal and stood up an Azure Container Service where he could upload his Docker image and configure settings such as the size and number of hosts, choice of orchestrator (Docker Swarm and Compose or Apache Mesos). Scott briefly showed how Marathon, an open source container-orchestration engine created and maintained by Mesosphere, can be used to deploy and manage containers running in the Container Service. Scott added that Team Services can be configured to automatically build and deploy Docker containers.

Scott mentioned a London based startup, Microscaling Systems, who spoke about Container Service at this year's Build event. A video of their talk can be found here.

DocumentDB

Finally, Scott rushed through the Data and Analytics section of his keynote. He started by talking about DocumentDB. DocumentDB is a fully managed NoSQL database service built for fast and predictable performance, high availability and automatic scaling. The service fully supports geo replication and failover between regions. Scott also explained that DocumentDB has protocol support for MongoDB. What this means is that DocumentDB databases can now be used as the data store for apps written for MongoDB. Using existing drivers for MongoDB, applications can communicate with DocumentDB, in many cases by simply changing a connection string.
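
In practice that looks something like the sketch below, where the MongoDB .NET driver is used unchanged and only the (placeholder) connection string points at a DocumentDB account:

    // A sketch of talking to DocumentDB through its MongoDB protocol support.
    // The connection string is a placeholder for a real DocumentDB account.
    using System.Threading.Tasks;
    using MongoDB.Bson;
    using MongoDB.Driver;

    public static class DocumentDbViaMongo
    {
        public static async Task InsertSampleAsync()
        {
            var client = new MongoClient(
                "mongodb://<account>:<key>@<account>.documents.azure.com:<port>/?ssl=true");

            var database = client.GetDatabase("gamedata");
            var collection = database.GetCollection<BsonDocument>("players");

            // Insert a document exactly as you would against a native MongoDB server.
            await collection.InsertOneAsync(
                new BsonDocument { { "name", "Billings Badger" }, { "score", 42 } });
        }
    }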

Despite running slightly over time, Scott showed us a YouTube video describing how the video game "The Walking Dead: No Man's Land" uses DocumentDB. You can watch the video here.

The Walking Dead: No Man's Land / Credit: Microsoft Azure

SQL Database

The very last items on the keynote agenda were SQL Database and SQL Data Warehouse. In the SQL Database space, Scott mentioned Elastic Pools. Simply put, Elastic Pools provide a simple, cost-effective solution for managing the performance goals of multiple databases that have widely varying and unpredictable usage patterns.

When might you use an Elastic Pool? If your application needs to support multiple tenants, there are two ways you could go about this. With the single-tenant database model, you deploy a database per tenant; with the multi-tenant database model, you typically add a tenant column to most of your database tables. Elastic Pools make the database-per-tenant approach affordable. They ensure that databases get the performance resources they need, when they need them, while providing a simple resource allocation mechanism within a predictable budget.

SQL Data Warehouse

Finally, Scott spoke briefly about SQL Data Warehouse. SQL Data Warehouse is a cloud-based, scale-out database capable of processing massive volumes of data - both relational and non-relational. You can store petabytes of data with massively parallel processing. You can import data from Azure resources such as DocumentDB and SQL Database and use tools such as Power BI to analyze the data and create rich dashboards.

Compute usage in SQL Data Warehouse is measured in SQL Data Warehouse Units (DWUs), a measure of the compute power allocated to your data warehouse. When you need faster results, you can increase your DWUs and pay for greater performance. Conversely, when you need less compute power, you can decrease your DWUs and pay only for what you need.

Threat Detection adds a layer of security, enabling customers to detect and respond to potential threats as they occur through security alerts on anomalous activities (such as SQL injection attempts).

Auditing tracks database events and writes audited events to an audit log in your Azure Storage account. Auditing can help you maintain regulatory compliance, understand database activity, and gain insight into discrepancies and anomalies that could indicate business concerns or suspected security violations.

Conclusion

It's always great to listen to Scott Guthrie speak and this keynote was no exception! It was a fantastic conference opener. For those new to Azure, it provided a great overview of Azure's many integrated cloud services; for those who have used Azure before, it was a great opportunity to hear about some of the platform's newer services. There is talk of another AzureCraft conference happening around Christmas time. Will I be there? Absolutely! Thanks to Scott Guthrie, the UK Azure User Group and everyone involved in the organisation of AzureCraft 2016.

Photos courtesy of Microsoft Developer (@msdevUK) and Mike James (@MikeCodesDotNet).

Article written by Mike Puddephat.
