Azure announcements for November 2018

Azure App Service | WildFly on Linux is in preview

WildFly, a popular, lightweight, open-source enterprise Java application server, is now available in preview on Azure App Service on Linux. This fully managed app server, powered by the Azul Zulu JDK, reaffirms Microsoft’s commitment to providing free LTS support and maintenance to Azure customers. WildFly on Azure runs as a standalone instance, but it can be configured to scale and connect with other Azure services for enterprise-grade performance.

Azure Virtual Machines (VMs) | HB VMs for HPC are in preview

New H-series Azure VMs for HPC workloads are in preview

Two new H-series (HB and HC) Azure Virtual Machines for high-performance computing (HPC) workloads are now available in preview.

The HB-series VMs are optimized for HPC applications driven by memory bandwidth, such as fluid dynamics, explicit finite element analysis, and weather modeling. The HB VMs feature 60 AMD EPYC 7551 processor cores, 4 GB of RAM per CPU core, no hyperthreading, and up to four managed disks. The AMD EPYC platform provides more than 260 GB/sec of memory bandwidth.

HC-series VMs, also now in preview, are optimized for HPC applications driven by intensive computation, such as implicit finite element analysis, reservoir simulation, and computational chemistry. HC-series VMs feature 44 Intel Xeon Platinum 8168 processor cores, 8 GB of RAM per CPU core, no hyperthreading, and up to four managed disks. The Intel Xeon Platinum platform supports Intel’s rich ecosystem of software tools and features an all-cores clock speed of 3.4 GHz for most workloads.

Azure Event Hubs | Event Hubs for Kafka is now available

Azure Event Hubs for Apache Kafka is now generally available 

With Azure Event Hubs for Apache Kafka, we’re bringing together two powerful distributed streaming platforms so you can access the breadth of Kafka ecosystem applications without having to manage servers or networks. Event Hubs is a fully managed, real-time data ingestion service that’s simple, trusted, and scalable. As a distributed streaming platform, Event Hubs lets you stream your data from any source—storing and processing millions of events per second—so you can build dynamic data pipelines and respond to business challenges in real time. Event Hubs for Kafka provides a Kafka endpoint so that any Kafka client running Kafka 1.0 or newer protocols can publish events to and consume events from Event Hubs with a simple configuration change. Easily access Kafka ecosystem applications like MirrorMaker while also benefiting from Event Hubs features like Capture (automatic delivery to Blob or Data Lake storage), Auto-Inflate (auto-scaling of throughput), and Geo Disaster-Recovery. With this integration, you can easily load data from on-premises to the cloud, and unlock analytics with native Azure services such as Azure Stream Analytics and Azure Databricks.
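As a sketch of what that configuration change looks like, an existing Kafka client can typically be repointed at an Event Hubs namespace with properties along these lines (the namespace name and connection string below are placeholders):

```properties
# Event Hubs exposes a Kafka endpoint on port 9093 of the namespace host
bootstrap.servers=mynamespace.servicebus.windows.net:9093
# Authentication is SASL PLAIN over TLS, using the namespace connection string as the password
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="$ConnectionString" \
  password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>";
```

No application code changes are needed; the same Kafka producer and consumer APIs work against the Event Hubs endpoint.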

Azure SQL Data Warehouse | Row level security now supported

Azure SQL Data Warehouse (SQL DW) now supports row-level security (RLS), adding a powerful capability to secure your sensitive data. With the introduction of RLS, you can implement security policies to control access to rows in your tables—that is, which users can access which rows. RLS enables this fine-grained access control without having to redesign your data warehouse. This simplifies the overall security model, since the access restriction logic lives in the database tier itself rather than away from the data in another application. RLS also eliminates the need to introduce views to filter out rows for access control management. This enterprise-grade security feature is available to all customers at no additional cost.

To learn more, read this Azure blog.
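As a sketch of how RLS is typically configured (the table and column names here are illustrative), a predicate function is bound to a table through a security policy, after which rows are filtered transparently for every query:

```sql
-- Inline table-valued function used as the filter predicate:
-- a row qualifies only when its SalesRep column matches the current user
CREATE FUNCTION dbo.fn_SalesRepPredicate (@SalesRep AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @SalesRep = USER_NAME();
GO

-- Bind the predicate to dbo.Sales; no view or application change is required
CREATE SECURITY POLICY dbo.SalesFilter
    ADD FILTER PREDICATE dbo.fn_SalesRepPredicate(SalesRep)
    ON dbo.Sales
    WITH (STATE = ON);
```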

Azure SQL Data Warehouse | Azure Data Lake Storage Gen2 integration

Azure SQL Data Warehouse (SQL DW) now has native integration with Azure Data Lake Storage Gen2. You can load data from ABFS into SQL DW using external tables, enabling integration with data lakes built on Data Lake Storage Gen2. To learn more, read this Azure blog.
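A minimal sketch of loading via an external table might look like the following (the storage account, credential, file layout, and table definitions are placeholders):

```sql
-- Credential for the Data Lake Storage Gen2 account (placeholder secret)
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

-- External data source pointing at an ABFS file system
CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://data@myaccount.dfs.core.windows.net',
    CREDENTIAL = ADLSCredential
);

CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

-- External table over the files, then load into SQL DW with CTAS
CREATE EXTERNAL TABLE dbo.ext_Sales
(   SaleId INT, Amount DECIMAL(18, 2)   )
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureDataLakeStore,
      FILE_FORMAT = TextFileFormat);

CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.ext_Sales;
```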

Azure SQL Data Warehouse | Maintenance scheduling

Azure SQL Data Warehouse (SQL DW) maintenance scheduling is now available. This new feature seamlessly integrates the Azure Service Health planned maintenance notifications with Resource Health monitor services. Maintenance scheduling allows customers to plan around the scheduled maintenance events the Azure SQL DW service uses to roll out new features, upgrades, and patches. There’s no additional cost for this feature.

To learn more, read this Azure blog.

Azure Virtual Machines | NDv2 in preview

New GPU-enabled NDv2 Azure VMs for AI and HPC scenarios are now in preview

Azure Cognitive Services | Container support in preview

Container support is now available in preview for a few of the Azure Cognitive Services, including Computer Vision, Face, and Text Analytics. Deploy Azure Cognitive Services on-premises and on the edge with container support. Cognitive Services containers allow developers to use the same intelligent APIs that are available in Azure, but with the flexibility that comes with Docker containers. Use Cognitive Services with complete control over your data, which is essential when you can’t send data to the cloud but need access to Cognitive Services technology. Cognitive Services containers give you flexibility in versioning and updating of models deployed in your solutions, and also enable the creation of a portable application architecture that can be deployed in the cloud, on-premises, and on the edge.

Please refer to our documentation for additional details.
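As an illustration, a Cognitive Services container is started with a docker run command along these lines (the image name, billing endpoint, and API key below are placeholders; the container reports usage to the billing endpoint but processes data locally):

```shell
docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
  mcr.microsoft.com/azure-cognitive-services/sentiment \
  Eula=accept \
  Billing=https://westus.api.cognitive.microsoft.com/text/analytics/v2.0 \
  ApiKey=<your-api-key>
```

Once running, the container exposes the same REST API shape as the cloud service on the mapped local port.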

Visual Studio 2017 | Version update

A version update to Visual Studio 2017 is now available. With this update, take advantage of everything listed below (and more) for free.

Please note that, in accordance with our support policy, this final update to Visual Studio 2017 is designated as the “Service Pack” and will be the only supported version starting January 14, 2020.

  • Installation
    • Import and export an installation configuration file that specifies which workloads and components should be installed with an instance of Visual Studio.
  • C++ development
    • Visual Studio Enterprise customers now have access to the step back feature in the debugger for C++. Additionally, we updated the Desktop Bridge framework packages and support ARM64 C++ scenarios.
  • UWP development
    • ARM64 support has been added for .NET UWP apps (.NET Native), as well as for the creation of .MSIX packages. We’ve also improved the designer experience with platform-only mode and fallback controls.
  • .NET mobile development
    • Xcode 10 is now supported, which enables you to build apps for iOS 12, tvOS 12, and watchOS 5. Xamarin.Android build performance has also been improved.
  • Other languages
    • F# developers will see several improvements when using byref. We’ve also updated to the latest Vue CLI 3.0 and now support project references for TypeScript projects.
  • SharePoint 2019 support
    • We added new templates that allow you to create projects for SharePoint 2019. Migrate existing SharePoint projects from both SharePoint 2013 and 2016 to the new template.

SQL Server 2014 Service Pack 3 | GA

SQL Server 2014 Service Pack 3 is now available. SQL Server 2014 SP3 contains more than 25 improvements centered around performance, scalability, and diagnostics, based on feedback from customers and the SQL community. These improvements enable SQL Server 2014 to perform faster and scale out of the box on modern hardware designs. It also showcases the SQL product team’s commitment to providing continued value in in-market releases. Additional release information can be found in this blog post and this article.

Data Migration Assistant | In preview

Data Migration Assistant: Support for Azure SQL Database Managed Instance now in preview

Now in preview, Data Migration Assistant helps you migrate on-premises SQL Server databases to Azure SQL Database Managed Instance. Data Migration Assistant detects compatibility and feature parity issues that can impact database functionality in target Azure SQL Database Managed Instance databases.

Learn more about Data Migration Assistant.

How to find the size of all indexes

SELECT
OBJECT_NAME(i.OBJECT_ID) AS TableName,
i.name AS IndexName,
i.index_id AS IndexID,
8 * SUM(a.used_pages) AS [IndexSize(KB)]
FROM sys.indexes AS i
JOIN sys.partitions AS p ON p.OBJECT_ID = i.OBJECT_ID AND p.index_id = i.index_id
JOIN sys.allocation_units AS a ON a.container_id = p.partition_id
GROUP BY i.OBJECT_ID, i.index_id, i.name
ORDER BY OBJECT_NAME(i.OBJECT_ID), i.index_id;

Processing time per query

-- Get top total worker time queries for the entire instance (Top Worker Time Queries)
SELECT TOP(50) DB_NAME(t.[dbid]) AS [Database Name],
LEFT(t.[text], 50) AS [Short Query Text],
qs.total_worker_time AS [Total Worker Time], qs.min_worker_time AS [Min Worker Time],
qs.total_worker_time/qs.execution_count AS [Avg Worker Time],
qs.max_worker_time AS [Max Worker Time],
qs.min_elapsed_time AS [Min Elapsed Time],
qs.total_elapsed_time/qs.execution_count AS [Avg Elapsed Time],
qs.max_elapsed_time AS [Max Elapsed Time],
qs.min_logical_reads AS [Min Logical Reads],
qs.total_logical_reads/qs.execution_count AS [Avg Logical Reads],
qs.max_logical_reads AS [Max Logical Reads],
qs.execution_count AS [Execution Count], qs.creation_time AS [Creation Time]
--, t.[text] AS [Query Text], qp.query_plan AS [Query Plan] -- uncomment these columns if not copying results to Excel
FROM sys.dm_exec_query_stats AS qs WITH (NOLOCK)
CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS t
CROSS APPLY sys.dm_exec_query_plan(plan_handle) AS qp
ORDER BY qs.total_worker_time DESC OPTION (RECOMPILE);

Processing times for SQL Agent jobs

-- Execution times of SQL Agent jobs
with qry as
(select job_id, last_executed_step_id
from msdb.dbo.sysjobactivity
where last_executed_step_id is not null)
select job_name,
CASE run_status
WHEN 0 THEN 'Failed'
WHEN 1 THEN 'Succeeded'
WHEN 2 THEN 'Retry'
WHEN 3 THEN 'Cancelled'
END as run_status,
convert(date, convert(varchar, run_date)) as run_date,
Isnull(Substring(CONVERT(VARCHAR, run_time + 1000000), 2, 2) + ':' +
Substring(CONVERT(VARCHAR, run_time + 1000000), 4, 2) + ':' +
Substring(CONVERT(VARCHAR, run_time + 1000000), 6, 2), '') as run_time,
--run_duration, step_name, message
Isnull(Substring(CONVERT(VARCHAR, run_duration + 1000000), 2, 2) + ':' +
Substring(CONVERT(VARCHAR, run_duration + 1000000), 4, 2) + ':' +
Substring(CONVERT(VARCHAR, run_duration + 1000000), 6, 2), '') as run_duration, step_name, message
from qry
cross apply
(select top (qry.last_executed_step_id + 1)
msdb.dbo.sysjobs.name as job_name,
run_status,
run_date, run_time,
run_duration, step_name,
message, step_id
FROM msdb.dbo.sysjobhistory
INNER JOIN msdb.dbo.sysjobs
ON msdb.dbo.sysjobhistory.job_id = msdb.dbo.sysjobs.job_id
where msdb.dbo.sysjobs.job_id = qry.job_id
order by run_date desc, run_time desc) t
order by job_name, step_id;


-- Quickly get row counts.
SELECT OBJECT_SCHEMA_NAME(p.object_id) AS [Schema]
, OBJECT_NAME(p.object_id) AS [Table]
, i.name AS [Index]
, p.partition_number
, p.rows AS [Row Count]
, i.type_desc AS [Index Type]
FROM sys.partitions p
INNER JOIN sys.indexes i ON p.object_id = i.object_id
AND p.index_id = i.index_id
WHERE OBJECT_SCHEMA_NAME(p.object_id) != 'sys'
AND OBJECT_NAME(p.object_id) = 'APS_MONITOR_ROW_HISTORY' -- Type your table name
ORDER BY [Schema], [Table], [Index]

Viewing SGAM contention in tempdb

Select session_id,
ResourceType = Case
When (Cast(Right(resource_description, Len(resource_description) - Charindex(':', resource_description, 3)) As Int) - 1) % 8088 = 0 Then 'Is PFS Page'
When (Cast(Right(resource_description, Len(resource_description) - Charindex(':', resource_description, 3)) As Int) - 2) % 511232 = 0 Then 'Is GAM Page'
When (Cast(Right(resource_description, Len(resource_description) - Charindex(':', resource_description, 3)) As Int) - 3) % 511232 = 0 Then 'Is SGAM Page'
Else 'Is Not PFS, GAM, or SGAM page'
End
From sys.dm_os_waiting_tasks
Where wait_type Like 'PAGE%LATCH_%'
And resource_description Like '2:%'

Finding foreign keys

Constraint_Name = C.CONSTRAINT_NAME
) PT
WHERE PK.Table_name = 'Address'
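The fragment above is only the tail end of a larger INFORMATION_SCHEMA query. A self-contained alternative for listing the foreign keys that reference a table (the table name 'Address' is carried over from the fragment as an example) is to query the sys.foreign_keys catalog views:

```sql
-- List foreign key constraints referencing a given table,
-- with the referencing and referenced columns
SELECT fk.name AS ConstraintName,
       OBJECT_NAME(fk.parent_object_id) AS ReferencingTable,
       pc.name AS ReferencingColumn,
       OBJECT_NAME(fk.referenced_object_id) AS ReferencedTable,
       rc.name AS ReferencedColumn
FROM sys.foreign_keys AS fk
JOIN sys.foreign_key_columns AS fkc
  ON fkc.constraint_object_id = fk.object_id
JOIN sys.columns AS pc
  ON pc.object_id = fkc.parent_object_id AND pc.column_id = fkc.parent_column_id
JOIN sys.columns AS rc
  ON rc.object_id = fkc.referenced_object_id AND rc.column_id = fkc.referenced_column_id
WHERE OBJECT_NAME(fk.referenced_object_id) = 'Address';
```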