• Five Things to Know About SQL Server’s In-Memory Technology

    Last week was an exciting week for the SQL Server team, as one of our favorite events took place – PASS Summit. If you attended PASS, you probably heard a ton about the latest version of SQL Server, SQL Server 2014.

    One of the key drivers of SQL Server 2014’s design was the in-memory technology that is built into the product. These capabilities, and the way they were designed, are a key differentiator for SQL Server 2014. Recently we discussed how using SQL Server 2014’s in-memory technology can have a dramatic impact on your business – speeding transactions, queries, and insights. Today let’s delve a little deeper into our in-memory solution and our unique approach to its design.

    We built in-memory technology into SQL Server from the ground up, making it the first in-memory database that works across all workloads. These in-memory capabilities are available not only on-premises, but also in the cloud when you use SQL Server in an Azure VM or use the upcoming in-memory columnstore capabilities within Azure SQL Database. So just what makes our approach so unique? This video describes it well.

    We have five core design points for SQL Server in-memory. These are: 

    1. It’s built-in. If you know SQL Server, you’re ready to go. You don’t need new development tools, you don’t have to rewrite your entire app, and there are no new APIs to learn.
    2. It increases speed and throughput. SQL Server’s in-memory OLTP design removes database contention with a lock- and latch-free table architecture while maintaining 100 percent data durability. This means you can take advantage of all your compute resources in parallel, supporting more concurrent users.
    3. It’s flexible. Your entire database doesn’t need to be in-memory. You can choose to store hot data in-memory and cold data on disk, while still being able to access both with a single query. This gives you the ability to optimize new or existing hardware.
    4. It’s easy to implement. The new migration advisor built right into SQL Server Management Studio lets you easily decide what to migrate to memory.
    5. It’s workload-optimized. In-memory OLTP is optimized for faster transactions, the enhanced in-memory columnstore gives you faster queries and reports, and the in-memory engine built into Excel and Analysis Services speeds analytics.
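    To make the first point concrete, here is a minimal T-SQL sketch of a memory-optimized table. The table and column names are hypothetical, not from this post; the only additions to familiar CREATE TABLE syntax are the hash index and the MEMORY_OPTIMIZED/DURABILITY options (the database must already have a memory-optimized filegroup).

    ```sql
    -- Hypothetical example: a session-cart table kept entirely in memory.
    -- DURABILITY = SCHEMA_AND_DATA preserves the full-durability guarantee
    -- described in point 2 above.
    CREATE TABLE dbo.ShoppingCart
    (
        CartId      INT NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        UserId      INT NOT NULL,
        CreatedDate DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    ```

    Existing T-SQL can read and write this table unchanged, which is what point 1 (it’s built-in) means in practice.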

    All of this combined leads to up to 30x faster transactions, over 100x faster queries and reporting, and easy management of millions of rows of data in Excel. Think about what this can do for your business.

    Learn more about SQL Server 2014 in-memory, or try SQL Server 2014 now. 

  • Available Now: Preview Release of the SQL Server PHP Driver

    Today we are pleased to announce the availability of a community technology preview release of the SQL Server Driver for PHP! Download the preview driver today here.

    This release will allow developers who use version 5.5 of the PHP scripting language to access Microsoft SQL Server and Microsoft Azure SQL Database. The full source code for the driver will be made available on GitHub at https://github.com/azure/msphpsql.
     
    The updated driver is part of SQL Server’s wider interoperability program, which includes the upcoming release of a JDBC Driver for SQL Server compatible with JDK 7. This driver will enable customers to develop Java 7 applications that connect to SQL Server and to move forward with the Java platform.

    We look forward to hearing your feedback about the new driver. Let us know what you think via GitHub.

  • Microsoft announces major update to Azure SQL Database, adds free tier to Azure Machine Learning

    This morning at the Professional Association for SQL Server (PASS) Summit, we celebrated SQL Server 2014’s strong momentum and introduced new services that further expand Microsoft’s big data platform. We announced a major update coming to our database-as-a-service, Azure SQL Database, and easier access to our machine learning service, Azure Machine Learning. These new updates continue our efforts to bring big data to everyone by delivering a comprehensive platform that ensures every organization, every team and every individual is empowered to do more and achieve more because of the data at their fingertips.

    I’m really pleased to be making these announcements today at PASS Summit, where, along with my colleagues Joseph Sirosh, corporate vice president of Machine Learning and Information Management; and James Phillips, general manager of Data Experiences; I delivered a keynote highlighting the momentum of SQL Server 2014 and other recent releases in our data platform such as Azure Stream Analytics, Azure Data Factory, Azure DocumentDB, Azure Search and Azure HDInsight. As the world’s largest gathering of SQL Server and business intelligence professionals, PASS is hugely important, enabling us to connect with SQL Server customers and gain valuable feedback to help inform the product’s development.

    Customers embrace SQL Server 2014

    SQL Server is the cornerstone of our big data platform. It is the world’s most widely deployed database and is used across the globe for mission-critical enterprise deployments. Last month, based largely on our work with SQL Server, Microsoft was recognized as a Leader in Gartner's Magic Quadrant for Operational Database Management Systems, and positioned furthest to the right in completeness of vision*. Earlier this year, we released SQL Server 2014, which includes built-in breakthrough in-memory OLTP and columnstore technologies, as well as hybrid cloud capabilities. Since then, SQL Server 2014 has seen tremendous growth and positive reception among customers, with more than 1.2 million downloads to date and 30 percent of Azure SQL Server virtual machines currently running SQL Server 2014.

    Clalit, Dell, Eastman Chemical Company, GameStop, Kiwibank, LC Waikiki, Pros, Saab and Stack Overflow are just a few of the customers using SQL Server 2014. GameStop is using SQL Server 2014 in two main scenarios: disaster recovery and backup to Azure to accommodate the company’s infrastructure freeze for holidays and big game launches, and as the default install for new SQL Server instances. Stack Overflow is a question and answer site for professional and enthusiast programmers. By basing their platform on technologies like SQL Server 2014 (specifically taking advantage of AlwaysOn Availability Group replicas), they have a highly available, high-performing platform that quickly delivers answers to thousands of users around the globe.

    Azure SQL Database, Azure Machine Learning

    Later this year, we will preview a new version of Azure SQL Database that represents another major milestone for this database-as-a-service. With this preview, we will add SQL Server capabilities that will make it easier to extend and migrate applications to the cloud, including support for larger databases with online indexing and parallel queries, improved T-SQL support with common language runtime (CLR) and XML indexes, and monitoring and troubleshooting with extended events. In addition, the preview will unlock our in-memory columnstore, which will deliver greater performance for data marts and continue our journey of bringing in-memory technologies to the cloud. We will offer these new preview capabilities as part of the service tiers introduced earlier this year, which deliver 99.99% availability, larger database sizes, restore and geo-replication capabilities, and predictable performance.
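    As a rough illustration of the columnstore capability mentioned above, the SQL Server 2014 syntax for a clustered columnstore index looks like this (the table and index names are assumptions for illustration, not from the announcement):

    ```sql
    -- Illustrative only: dbo.FactSales is a hypothetical data-mart fact table.
    -- A clustered columnstore index stores the entire table in compressed,
    -- columnar format, which is what speeds up scan-heavy reporting queries.
    CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales
        ON dbo.FactSales;
    ```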

    Microsoft Azure Machine Learning is a fully managed cloud service for building predictive analytics solutions, and helps overcome the challenges most businesses face in deploying and using machine learning. Starting today, it will be easier than ever for anyone to try Azure Machine Learning, as the service is now available to test free of charge without a subscription or credit card – all customers need to get started is a Microsoft account ID. This free tier is one more way Azure Machine Learning is making advanced analytics more accessible to more people. DBAs, developers, business intelligence professionals and nascent data scientists can now experiment with Azure Machine Learning at no cost.

    Microsoft Data Platform

    We are making all these investments in SQL Server and the rest of our data platform because we are living and working in an amazing time where organizations are utilizing data to make smarter decisions, better predict their customers’ needs and provide more differentiated products and services. Data has become the new currency and it is helping to differentiate today’s leading companies. 

    To get there, organizations need a comprehensive platform to capture and manage all of their data, transform and analyze that data for new insights, and provide tools which enable users across their organization to visualize data and make better business decisions. Microsoft’s approach is to make it easier for our customers to work with data of any type and size – using the tools, languages and frameworks they want – in a trusted environment on-premises and in the cloud. To learn more about our approach to big data, visit our web page.

    *Disclaimer:
    Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

  • The Ins and Outs of Azure Data Factory – Orchestration and Management of Diverse Data

    Yesterday at TechEd Europe 2014, Microsoft announced the preview of Azure Data Factory. This post will give you the ins and outs of this new service.

    What is Azure Data Factory?

    Azure Data Factory is a fully managed service that orchestrates data and processing services into managed data pipelines. A pipeline connects diverse data sources (such as SQL Server on-premises, or cloud data in Azure SQL Database, Blobs, Tables, and SQL Server in Azure Virtual Machines) with diverse processing techniques (such as Azure HDInsight (Hive and Pig) and custom C# activities). This allows the data developer to transform and shape the data (join, aggregate, cleanse, enrich) so that it becomes authoritative and trustworthy enough to be consumed by BI tools. These pipelines are all managed within a single pane of glass where rich health and lineage information is available to diagnose issues or perform impact analysis across all data and processing assets. Some unique points about Data Factory are:

    • The ability to process data from diverse locations and data types. Data Factory can pull data from relational, on-premises sources like SQL Server and join it with non-relational, cloud sources like Azure Blobs.
    • A holistic view of the entire IT infrastructure that includes both commercial and open-source technologies. Data Factory can orchestrate Hive and Pig jobs on Hadoop while also bringing commercial products and services like SQL Server and Azure SQL Database into a single view.
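    As a sketch of what a pipeline might orchestrate, the Hive processing mentioned above could run a HiveQL transformation like the following (the table and column names are hypothetical, for illustration only):

    ```sql
    -- Hypothetical HiveQL activity: aggregate raw web logs into a cleansed,
    -- query-ready table that downstream BI tools can trust.
    INSERT OVERWRITE TABLE cleansed_weblogs
    SELECT user_id, page, COUNT(*) AS hits
    FROM raw_weblogs
    WHERE user_id IS NOT NULL
    GROUP BY user_id, page;
    ```

    A Data Factory pipeline would schedule a step like this alongside data-movement activities, so the cleansed output lands where reporting tools expect it.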

    What can it do?

    With the ability to manage and orchestrate the collection, movement and transformation of semi-structured and structured data together, Data Factory provides customers with a central place to manage their processing of web log analytics, click stream analysis, social sentiment, sensor data analysis, geo-location analysis, and more. In public preview, Microsoft views Data Factory as a key tool for customers who are looking to have a hybrid story with SQL Server or who currently use Azure HDInsight, Azure SQL Database, Azure Blobs, and Power BI for Office 365. In the future, we’ll bring more data sources and processing capabilities to the Data Factory.

    How do I get started?

    For Microsoft customers, we are offering Azure Data Factory as a public preview.  To get started, customers will need to have an Azure subscription or a free trial to Azure. With this in hand, you should be able to get Azure Data Factory up and running in minutes. Start by reading this getting started guide.

    For more information on Azure Data Factory:

  • The Ins and Outs of Azure Stream Analytics – Real-Time Event Processing

    Yesterday at TechEd Europe 2014, Microsoft announced the preview of Azure Stream Analytics. This post will give you the ins and outs of this new service.

    What is Azure Stream Analytics?

    Azure Stream Analytics is a cost-effective event processing engine that helps uncover real-time insights from devices, sensors, infrastructure, applications, and data. Deployed in the Azure cloud, Stream Analytics offers elastic scale, with resources efficiently allocated and paid for only as requested. Developers get a rapid development experience, describing their desired transformations in a SQL-like syntax. Some unique aspects of Stream Analytics are:

    • Low cost: Stream Analytics is architected for multi-tenancy, meaning you only pay for what you use, not for idle resources. Unlike other solutions, even small streaming jobs are cost-effective.
    • Faster developer productivity: Stream Analytics allows developers to use a SQL-like syntax that can cut development from thousands of lines of code down to a few lines. The system abstracts away the complexities of parallelization, distributed computing, and error handling.
    • Elasticity of the cloud: Stream Analytics is built as a managed service in Azure. This means customers can spin up or down any number of resources on demand, without having to set up costly hardware or install and maintain software.
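    To give a feel for the SQL-like syntax, a minimal Stream Analytics query might look like the following (the stream and column names are assumptions for illustration):

    ```sql
    -- Hypothetical query: count events per device over 10-second tumbling
    -- windows. TIMESTAMP BY tells the engine which field carries event time.
    SELECT DeviceId, COUNT(*) AS EventCount
    FROM InputStream TIMESTAMP BY EventTime
    GROUP BY DeviceId, TumblingWindow(second, 10)
    ```

    The engine handles the windowing, parallelization, and error handling behind this query, which is what keeps the code down to a few lines.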

    Similar to the recent announcement Microsoft made in making Apache Storm available in Azure HDInsight, Stream Analytics is a stream processing engine that is integrated with a scalable event queuing system like Azure Event Hubs. By making both Storm and Stream Analytics available, Microsoft is giving customers options to deploy their real-time event processing engine of choice.

    What can it do?

    Stream Analytics will enable various scenarios including Internet of Things (IoT) such as real-time fleet management or gaining insights from devices like mobile phones and connected cars. Specific scenarios that customers are doing with real-time event processing include:

    • Real-time ingestion, processing, and archiving of data: Customers use Stream Analytics to ingest a continuous stream of data and perform in-flight processing, such as scrubbing personally identifiable information (PII), adding geo-tags, and doing IP lookups before the data is sent to a data store.
    • Real-time analytics: Customers use Stream Analytics to power real-time dashboards where they can see trends the moment they occur.
    • Connected devices (Internet of Things): Customers use Stream Analytics to get real-time information from connected devices like machines, buildings, or cars so that relevant action can be taken, such as scheduling a repair technician, pushing down software updates, or performing a specific automated action.

    How do I get started?

    For Microsoft customers, we are offering Azure Stream Analytics as a public preview.  To get started, customers will need to have an Azure subscription or a free trial to Azure. With this in hand, you should be able to get Azure Stream Analytics up and running in minutes. Start by reading this getting started guide.

    For more information on Azure Stream Analytics: