
How to use/create ARM templates for deployments


I was building a lab for the creation of a Hybrid Connection Manager (HCM) link between an Azure App Service and an Azure VM within a VNET. In most cases, yes, you use HCM to access resources hosted within another network (not always on Azure), but I didn't have one of those and therefore created the VNET with an HCM and web server within it. As seen in Figure 1, I have a VNET, a VM, an App Service and all the other required features necessary to support those Azure features.


Figure 1, automatic deployment on Azure using ARM templates

Once the original, manual deployment completed, I selected the Automation script link on the Resource Group blade and saved the script.  See Figure 2.


Figure 2, download automatic deployment on Azure using ARM templates

There is a very nice description of how to then use these scripts to deploy here. See the other related articles on that same page as well, where it discusses deploying not only with PowerShell but also with the CLI and via the portal.
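If you prefer to script this outside the portal, the same extracted template can also be deployed programmatically. Below is a minimal C# sketch using the Azure Management Libraries for .NET (fluent style); this is my own illustration rather than part of the linked article, and the credentials file name is an assumption:

using System;
using System.IO;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Models;

class DeployArmTemplate
{
    static void Main()
    {
        // Authenticate from an auth file (path is hypothetical)
        var azure = Azure.Authenticate("my.azureauth").WithDefaultSubscription();

        // template.json is the file saved from the Automation script blade
        var template = File.ReadAllText("template.json");

        azure.Deployments.Define("READINESS017")                  // deployment name
            .WithExistingResourceGroup("READINESS-HC017-RG")      // target resource group
            .WithTemplate(template)
            .WithParameters("{}")                                 // or the contents of parameters.json
            .WithMode(DeploymentMode.Incremental)
            .Create();

        Console.WriteLine("Deployment submitted.");
    }
}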

Once I downloaded the script, I deleted the resource group.

Using the Template deployment feature, then selecting 'Build your own template in the editor', I was able to upload the template which I downloaded previously.


Figure 3, upload automatic deployment on Azure using ARM templates

Load the TEMPLATE.JSON file into the editor by selecting the Load file button, then click the Save button as shown in Figure 4.


Figure 4, upload and save automatic deployment on Azure using ARM templates

After selecting Save, review the output, agree to the Terms and Conditions, then select the Purchase button. After some modifications, I was able to successfully deploy, as shown in Figure 5.


Figure 5, deploy automatic deployment on Azure using ARM templates

Although it did take some time, it completed successfully and I was a happy person.

To get an overview of the project I worked on, read the following articles as well.


Troubleshooting App Service Hybrid Connection Manager


I wrote the article "Enable logging for your Hybrid Connection Manager, troubleshooting", but that was for the deprecated Hybrid Connection feature, which uses Azure BizTalk. You would see any 'classic' connections in the 'Classic hybrid connections' area on the Hybrid connections page, as seen in Figure 1.

Figure 1, classic hybrid connections

If you are running classic hybrid connections, migrate to the new Azure App Service Hybrid Connections, which are based on Azure Service Bus Relay; read more about that here.

*Note: always run the most current version of the HCM; see Figure 1. The most current version will be the one in the portal on the Hybrid connections page. Click 'Download connection manager' to get the package for installation on the endpoint to which you want the Azure Function or Azure App Service to connect.

There are already some very nice articles about troubleshooting HCM issues, so I will link to them here:

HybridConnection failed to start sb://readiness-sb017.servicebus.windows.net/readiness-function-hc017 One or more errors occurred.    at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
at Microsoft.HybridConnectionManager.HybridConnectionService.StartEndpoint(HybridConnectionElement hybridConnectionElement).

HybridConnectionManager Trace: System.ObjectDisposedException: Cannot access a disposed object.
Object name: 'System.Net.Sockets.NetworkStream'.
at System.Net.Sockets.NetworkStream.EndRead(IAsyncResult asyncResult)
at System.Threading.Tasks.TaskFactory`1.FromAsyncTrimPromise`1.Complete(TInstance thisRef, Func`3 endMethod, IAsyncResult asyncResult, Boolean requiresSynchronization)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.HybridConnectionManager.Util.<AsyncPipeStream>d__0.MoveNext().


I wrote some other Hybrid Connection articles here:

To get an overview of the project I worked on, read the following articles as well.

Creating an Azure App Service Hybrid Connection


This is such a simple yet powerful tool to use: easy to implement, and it does some real hard-core, complicated stuff in the background to get the connectivity to work. Similar instructions can be found here: "Azure App Service Hybrid Connections".

I wrote this article about connecting an Azure App Service to VM in a VNET using Point-to-Site (P2S):

In this scenario, I again connect an Azure App Service to an Azure VM in a VNET; I did this because I have no on-premises network to connect to. However, I am certain it is not any more complicated.

The steps required to configure a Hybrid Connection are:

  • Configure the endpoint which the App Service will connect to
  • Install and configure the Hybrid Connection Manager on the server being connected to
  • Test it out

Configure the endpoint which the App Service will connect to

First access the Azure App Service in the portal and select the Networking link as shown in Figure 1.


Figure 1, configure azure app service hybrid connection, app service 

Notice in Figure 2 that because I selected a Standard SKU, I get 25 connections. The number of connections is based on the selected SKU and can be seen in detail here. Initially, I thought the connection limit meant the maximum number of concurrent connections my Azure App Service could have open with the configured backend server; this turned out to be wrong. Based on my SKU, I can create Hybrid Connections to 25 different backends.


Figure 2, configure azure app service hybrid connection, app service

Also in Figure 2, see in the red square "Download connection manager": this is the location to download the Hybrid Connection Manager installation package for installation on the onsite or other dedicated server to which you want to create the connection.

As illustrated in Figure 3, the name of the Hybrid Connection can be anything; just make it describe the connection so that in the future you know what the connection is for. The Endpoint Host should match the name of the server (NETBIOS) to which you are connecting. I call out NETBIOS, as opposed to FQDN, because I have read that using the FQDN causes some problems if you do not have a DNS server; also, do not use an IP address. Although I have read about those possible issues, I have not tried it to see for myself. But I think you can make the configuration simple even with those constraints.


Figure 3, configure azure app service hybrid connection, app service

There are some ports which the hybrid connection uses, so avoid 9350-9354, 5671, 80 and 443. I have not seen any document stating that any other port is restricted. I wrote an article here that explains more about how those ports are used. That article is about HCM with BizTalk, which is deprecated, but the port descriptions still apply as far as I know.

Lastly, it is a good idea to create the Service Bus in the same region as the App Service.  Select OK and then you will see the newly created hybrid connection in the portal, as seen in Figure 4.


Figure 4, configure azure app service hybrid connection, app service

That is all from an App Service perspective. Next, you need to install the Hybrid Connection Manager on the machine you want the App Service to connect to.

Install and configure the Hybrid Connection Manager on the server being connected to

Download the Hybrid Connection Manager package I mentioned previously (shown in Figure 2) and, once it is installed, open it. You will find it in the menu, similar to that shown in Figure 5.


Figure 5, configure azure app service hybrid connection, onsite or backend server

Once opened, you should see a window similar to that shown in Figure 6.


Figure 6, configure azure app service hybrid connection, onsite or backend server

Click on 'Add a new Hybrid Connection' and then log in to the Azure subscription which contains the App Service you configured in the previous steps (Figures 1-4).

Select the subscription from the drop-down, which will list the Hybrid Connections; select the one you want, then save it. The result is something similar to Figure 7.


Figure 7, configure azure app service hybrid connection, onsite or backend server

Once the save completes, return to the portal and view the Hybrid Connection (see Figure 8); you will notice the status has also changed to Connected.


Figure 8, configure azure app service hybrid connection, app service

Now the connectivity should be functional between the App Service and the Azure VM in the VNET.

Test it out

As shown in Figure 9, before the Hybrid Connection Manager was configured on the backend VM (Figure 7), but after the Hybrid Connection was configured for the App Service (Figure 4), I was able to get a successful TCPING response. I interpret this to mean that TCPING only checks whether the port is open, not that the machine is responding to the TCP ping. This was a new learning for me.

I also checked using a CURL, and it failed as expected.
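As an aside, what TCPING checks can be reproduced with a few lines of code: it attempts a TCP connect and reports success without exchanging any application data. Here is a minimal C# sketch of that idea (the host name and port are placeholders, not values from the original test):

using System;
using System.Net.Sockets;

class PortCheck
{
    static void Main()
    {
        try
        {
            using (var client = new TcpClient())
            {
                // Succeeds as soon as something accepts the TCP connection;
                // no HTTP request is sent, so it says nothing about the application
                client.Connect("WEB001", 80); // placeholder host and port
                Console.WriteLine("Port is open (TCP connect succeeded)");
            }
        }
        catch (SocketException ex)
        {
            Console.WriteLine("Port closed or unreachable: " + ex.Message);
        }
    }
}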


Figure 9, testing, troubleshooting azure app service hybrid connection, app service

After I installed the Hybrid Connection Manager on the backend server (Figure 7) and configured it, the CURL worked, although not as expected (because there was some content in the index.html file); but it did work, and without an error, Figure 10. The TCPING kept working too, as expected.


Figure 10, testing, troubleshooting azure app service hybrid connection, app service

Instead of using CURL and TCPING, I am going to make an HttpClient call from an Azure Function via the same Hybrid Connection. Read about how I do that here: "How to Azure Function App with Hybrid Connection".

To get an overview of the project I worked on, read the following articles as well.

CHALLENGE #1 – Auto-fill company information on the customer card


Challenge

As a new customer is entered in Dynamics 365 Business Central, the user can decide to enter a domain name instead of the name, which leads to the system looking up the information for the company associated with this domain name from a Web Service and filling out the remaining fields on the customer card with information obtained from the Web Service.

To complete this challenge, you will need

  • A Dynamics 365 Business Central Sandbox Environment
  • Visual Studio Code with the AL Extension installed
    • Azure VMs will have VS Code pre-installed
  • An API Key from http://www.fullcontact.com

Expected result

Steps

  • Create an empty app
  • Create a page extension for the customer card
  • On the OnAfterValidate trigger on the Name field, check whether the entered value is a domain name
  • Ask the user whether they want to look up information about the company associated with this domain name
  • Call the fullcontact Web API and assign field values

Hints

  • In VS Code, use Ctrl+Shift+P, type AL GO, and remove the customerlist page extension
  • Use the tpageext snippet
  • Use EndsWith to check whether the name is a domain name
  • Use the Confirm method to ask whether the user wants to download info
  • Use HttpClient to communicate with the Web Service
  • Use Json types (JsonObject, JsonToken, JsonArray and JsonValue) to extract values from the Web Service result; a sketch of this request-and-parse pattern follows these hints
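The challenge itself is written in AL, but the request-and-parse pattern behind the last two hints is the same in any language. Here is a hedged C# sketch of the flow; the FullContact endpoint, header name, and JSON field path are assumptions for illustration, so check the FullContact documentation for the real contract:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class CompanyLookup
{
    static readonly HttpClient client = new HttpClient();

    static async Task Main()
    {
        var domain = "microsoft.com";

        // Endpoint and API-key header are illustrative assumptions, not verified
        var request = new HttpRequestMessage(HttpMethod.Get,
            "https://api.fullcontact.com/v2/company/lookup.json?domain=" + domain);
        request.Headers.Add("X-FullContact-APIKey", "<your api key>");

        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // Extract values from the JSON result, as the AL Json types would
        var json = JObject.Parse(await response.Content.ReadAsStringAsync());
        Console.WriteLine("Name: " + (string)json.SelectToken("organization.name")); // hypothetical field path
    }
}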

Cheat Sheets

 

Happy coding

Freddy Kristiansen
Technical Evangelist

How to Azure Function App with Hybrid Connection


On my path to create this article, I wrote numerous others along the way. To get an overview of the project I worked on, read the following articles as well.

Basically, my project was to create an ARM template that created the features required to test the Hybrid Connection manager.

In this article, I will discuss connecting an Azure Function to an Azure VM in a VNET (also in Azure) using the App Service Hybrid Connection Manager. Here are the steps:

  • Create the Azure Function (App Service / Dedicated, i.e. not Consumption)
  • Configure App Service Hybrid Connection on the Azure Function
  • Configure the Hybrid Connection Manager (HCM) on the Azure Virtual Machine

Create the Azure Function (App Service / Dedicated, i.e. not Consumption)

Create a new App Service Plan Azure Function from within the Azure portal.  Select the Function similar to that seen in Figure 1.


Figure 1, how to create an Azure Function App, Hybrid Connection

Then, as seen in Figure 2, give the Azure Function a name, and, very importantly, as of writing this article it is only possible to configure a Hybrid Connection when the function is running on an App Service Plan hosting plan. An App Service Plan will remain within the same tenant, while it is possible that Consumption-based Azure Functions run in multiple tenants all at the same time.

Figure 2, how to create an Azure Function App, Hybrid Connection

Then save the Azure Function.

Create a simple HTTP Trigger, Figure 3.


Figure 3, how to create an Azure Function, Hybrid Connection

Then you can test the Azure Function using, for example, CURL, as seen in Figure 4.


Figure 4, how to test an Azure Function, Hybrid Connection

Configure App Service Hybrid Connection on the Azure Function

Click on the Function App -> the Platform features tab -> Configure your hybrid connection endpoints, as seen in Figure 5.


Figure 5, how to configure an Azure Function, Hybrid Connection, configure endpoints

As seen in Figure 6, give the connection a name; the endpoint should be the NETBIOS name of the server which this Azure Function App will connect to, along with the port on which it will make the connection.


Figure 6, how to configure an Azure Function, Hybrid Connection, configure endpoints

If you already have connection endpoints, then you can reuse an existing Service Bus. Note that there are different SKUs of Service Bus, so if you reuse the same one, make sure it is a SKU which supports the throughput. See here.

Once the Hybrid Connection endpoint is configured, you will see something similar to Figure 7, with a status of Not Connected.

 


Figure 7, how to configure an Azure Function, Hybrid Connection, configure endpoints

Configure the Hybrid Connection Manager (HCM) on the Azure Virtual Machine

You can download the Hybrid Connection Manager from the same page where you configured the Hybrid Connection endpoint. Selecting "Configure your hybrid connection endpoints", as seen previously in Figure 5, is the place to download the installation package.

Once you have it, install it on the machine to which you want the Azure Function App to connect. After the installation, you are prompted to enter your Azure credentials; use credentials which have access to the Azure Function App you created. After authentication, select the correct subscription and you will see the Hybrid Connection you just created. See Figure 8.


Figure 8, how to configure an Azure VM or HOST, Hybrid Connection, configure endpoints

Once configured, navigate back to the Azure Function App and look at the status of the Hybrid Connection, Figure 9. If all is OK, the status will show as Connected and you will be able to connect to the VM using the configured port.


Figure 9, how to configure an Azure VM or HOST, Hybrid Connection, configure endpoints

Here is an example of some code to use for the Azure Function; see here also for a discussion on optimal use of connections and coding patterns: "Managing Connections". Note that the HttpClient instance is static, so connections are reused across invocations instead of exhausting sockets.

using System.Net;
using System.Net.Http;

// Static so the same HttpClient (and its connections) is reused across invocations,
// as recommended in the "Managing Connections" guidance
private static HttpClient httpClient = new HttpClient();

public static async Task<HttpResponseMessage> 
            Run(HttpRequestMessage req, TraceWriter log)
{
    try {
        // HCM200:7071 is the endpoint host and port configured for the Hybrid Connection
        var response = await httpClient.GetAsync("http://HCM200:7071");
        var content = await response.Content.ReadAsStringAsync();
        return req.CreateResponse(HttpStatusCode.OK, "Response: " + content);
    }
    catch (Exception ex) {
        return req.CreateResponse(HttpStatusCode.BadRequest, "Error: " + ex.Message); 
    }
}

How to deploy to Azure using an ARM template with PowerShell


I wrote the article "How to use/create ARM templates for deployments", where I show how I generated the ARM template for multiple Azure features existing in a Resource Group. I then deployed all the features from the ARM template using the Template deployment blade in the portal. This article explains how to run the same ARM template from PowerShell. It is actually quite simple.

When you create the Automation script via the portal, the extraction also creates a PS1 file named deploy.ps1. This contains the script that performs the deployment using the JSON files, which were also created. Simply right-click on the deploy.ps1 file, as seen in Figure 1, and select "Run with PowerShell".


Figure 1, deploy ARM template using PowerShell

One note about taking the "Run with PowerShell" approach: the window closes if there is an exception, and upon completion, so you don't get a chance to see the details of either. You can also select the 'Edit' menu item and run it from the PowerShell ISE, as seen in Figure 2.


Figure 2, deploy ARM template using PowerShell

If all goes well, you will be prompted for the necessary information, and the execution will result in the proper deployment of the resources. The output of my ARM template, which was extracted from the resource group here, is shown in Figure 3.


Figure 3, deploy ARM template using PowerShell

cmdlet deploy.ps1 at command pipeline position 1
Supply values for the following parameters:
subscriptionId: 
resourceGroupName: READINESS-HC017-RG
deploymentName: READINESS017
Logging in...

Account          : 
SubscriptionName : 
SubscriptionId   : 
TenantId         : 
Environment      : AzureCloud

Selecting subscription 

Name               : [, ]
Account            : 
Environment        : AzureCloud
Subscription       : 
Tenant             : 
TokenCache         : Microsoft.Azure.Commands.Common.Authentication.AuthenticationStoreTokenCache
VersionProfile     : 
ExtendedProperties : {}

Registering resource providers
Registering resource provider microsoft.compute
ProviderNamespace : Microsoft.Compute
RegistrationState : Registered
ResourceTypes     : {availabilitySets, virtualMachines, virtualMachines/extensions, virtualMachineScaleSets...}
Locations         : {East US, East US 2, West US, Central US...}
ZoneMappings      : 

Registering resource provider microsoft.network
ProviderNamespace : Microsoft.Network
RegistrationState : Registered
ResourceTypes     : {virtualNetworks, publicIPAddresses, networkInterfaces, loadBalancers...}
Locations         : {West US, East US, North Europe, West Europe...}
ZoneMappings      : 

Registering resource provider microsoft.storage
ProviderNamespace : Microsoft.Storage
RegistrationState : Registered
ResourceTypes     : {storageAccounts, operations, locations/asyncoperations, storageAccounts/listAccountSas...}
Locations         : {East US, East US 2, West US, West Europe...}
ZoneMappings      : 

Registering resource provider microsoft.web
ProviderNamespace : Microsoft.Web
RegistrationState : Registered
ResourceTypes     : {sites/extensions, sites/slots/extensions, sites/instances, sites/slots/instances...}
Locations         : {Central US, North Europe, West Europe, Southeast Asia...}
ZoneMappings      : 

Resource group READINESS-HC017-RG does not exist. To create a new resource group, please enter a location.
resourceGroupLocation: West Europe
Creating resource group READINESS-HC017-RG in location West Europe
ResourceGroupName : READINESS-HC017-RG
Location          : westeurope
ProvisioningState : Succeeded
Tags              : 
TagsTable         : 
ResourceId        : /subscriptions//resourceGroups/READINESS-HC017-RG

Starting deployment...

DeploymentName          : template
CorrelationId           : ac008d7b-c64a-46ba-8fa3-8112260635d5
ResourceGroupName       : READINESS-HC017-RG
ProvisioningState       : Succeeded
Timestamp               : 21.03.2018 09:16:56
Mode                    : Incremental
TemplateLink            : 
TemplateLinkString      : 
DeploymentDebugLogLevel : 
Parameters              : 
  {[sites_READINESS_HCM001_name, Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.DeploymentVariable], 
   [virtualMachines_HCM001_name, Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.DeploymentVariable], 
   [virtualMachines_WEB001_name, Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.DeploymentVariable], 
   [serverfarms_READINESS_HCM_ASP_name, Microsoft.Azure.Commands.ResourceManager
         .Cmdlets.SdkModels.DeploymentVariable]...}
ParametersString        : 
                          Name             Type                       		Value     
                          ===============  =========================  		==========
                          sites_READINESS_HCM001_name  String                   READINESS-HCM001
                          virtualMachines_HCM001_name  String                   HCM001    
                          virtualMachines_WEB001_name  String                   WEB001    
                          serverfarms_READINESS_HCM_ASP_name  String            READINESS-HCM-ASP
                          networkInterfaces_hcm001446_name  String              hcm001446 
                          networkInterfaces_web001454_name  String              web001454 
                          publicIPAddresses_HCM001_ip_name  String              HCM001-ip 
                          publicIPAddresses_WEB001_ip_name  String              WEB001-ip 
                          config_web_name  String                     		web       
                          networkSecurityGroups_HCM001_nsg_name  String         HCM001-nsg
                          networkSecurityGroups_WEB001_nsg_name  String         WEB001-nsg
                          virtualNetworks_READINESS_HCM001_RG_vnet_name  String  READINESS-HCM001-RG-vnet
                          storageAccounts_readinesshcm001rgdiag701_name  String  readinesshcm001rgdiag701
                          subnets_default_name  String                          default   
                          securityRules_default_allow_rdp_name  String          default-allow-rdp
                          securityRules_default_allow_rdp_name_1  String        default-allow-rdp
                          hostNameBindings_readiness_hcm001.azurewebsites.net_name  
                              String  +.azurewebsites.net
                          virtualMachines_HCM001_id  String                     
                                                     /subscriptions//resourceGroups/READ
                                                     INESS-HCM001-RG/providers/Microsoft.Compute/disks/
                                                     HCM001_OsDisk_1_1189fc07b347421b878b9632040d5414
                          virtualMachines_WEB001_id  String                     
                                                     /subscriptions/2/resourceGroups/READ
                                                     INESS-HCM001-RG/providers/Microsoft.Compute/disks/
                                                     WEB001_OsDisk_1_f29722f8755744eb8d17074eb05df182
                          
Outputs                 : 
OutputsString           : 

NOTES:

  • It states in the notes of the deploy.ps1 file that if the parameters.json file is found, it will be used to fill the required parameter values. By default, in my case, all those values were null. If the parameters.json file is not found, then values are collected from the template.json file, which did have the values. I simply removed the parameters.json file from the directory from which I ran deploy.ps1. Here is the exception:
New-AzureRmResourceGroupDeployment : 09:29:43 - Error: Code=InvalidDeploymentParameterValue; 
  Message=The value of deployment parameter 
   hostNameBindings_readiness_hcm001.azurewebsites.net_name is null. 
  Please specify the value or use the parameter reference. See 
    https://aka.ms/arm-deploy/#parameter-file for details.
     At C:\Users\benperk\Downloads\FullTemplate\deploy.ps1:104 char:5
     +     New-AzureRmResourceGroupDeployment -ResourceGroupName $resourceGr ...
     +     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
    + FullyQualifiedErrorId : 
       Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet

And here is the content from the deploy.ps1 file which provided the clarification:

 .SYNOPSIS
    Deploys a template to Azure

 .DESCRIPTION
    Deploys an Azure Resource Manager template

 .PARAMETER subscriptionId
    The subscription id where the template will be deployed.

 .PARAMETER resourceGroupName
    The resource group where the template will be deployed. Can be the name of an 
     existing or a new resource group.

 .PARAMETER resourceGroupLocation
    Optional, a resource group location. If specified, will try to create a new 
    resource group in this location. 
    If not specified, assumes resource group is existing.

 .PARAMETER deploymentName
    The deployment name.

 .PARAMETER templateFilePath
    Optional, path to the template file. Defaults to template.json.

 .PARAMETER parametersFilePath
    Optional, path to the parameters file. Defaults to parameters.json. If file is not 
    found, will prompt for parameter values based on template.
  • Initially, I needed to execute Set-ExecutionPolicy to be able to run this, even as an Administrator

That was not so hard, and it worked as expected.

To get an overview of the project I worked on, read the following articles as well.

How to build .NET Standard libraries and generate NuGet packages


Creating NuGet packages was never exactly rocket science, but if a developer wanted to get a package onto as many platforms as possible, they had to target it at various framework versions, which meant a lot of extra work during development itself. Thanks to .NET Standard there is relief: a way to create a working multi-platform package directly and simply.

A few words about .NET Standard

Several implementations of the .NET Framework currently exist. The best known are the traditional .NET Framework 4.x, the cross-platform .NET Core 2.x, and, for example, Mono 5.4. All of these frameworks currently implement some version of .NET Standard, thereby committing to meet all of its specifications.

The table below gives a more precise idea of which framework versions implement which version of .NET Standard. You can see from the table that, for example, .NET Framework 4.6 implements .NET Standard 1.3. That is its highest supported version, so it also implements everything in .NET Standard 1.2, 1.1 and 1.0.

.NET Standard versions

Which version of .NET Standard should I target?

If I decide as a developer that I want to produce a multi-platform package, I have to choose a suitable version of .NET Standard. This choice is crucial.

Higher version = more APIs = smaller audience

If I go for the highest version, .NET Standard 2.0, my package will be able to use more APIs than with older versions. At the same time, keep in mind that only newer versions of the target frameworks satisfy such an API set (e.g. .NET Framework 4.6.1 and .NET Core 2.0). This means that a developer writing a web application targeting .NET Core 1.0 will not be able to use such a package.

Lower version = fewer APIs = larger audience

Ideally, then, you choose the lowest version of .NET Standard that is equipped with everything your package will need. Lower versions of .NET Standard are already satisfied by lower framework versions, so far more developers will be able to use such a package.

So if you want to create a simple package that does not use any extra features, there is no point in targeting the newest versions of .NET Standard.

A practical library example

A practical example of the simplification is the FioSDK library (NuGet package), which I maintained in the past while battling with targeting different framework versions. Thanks to .NET Standard, those obscure approaches are history today:

Targeting in the days before .NET Standard

Transformed this way, the NuGet package targets .NET Standard 1.3 and can therefore be used to develop applications on .NET Core from version 1.0 and on .NET Framework 4.6, but also for development with Mono, Xamarin, or Universal Windows Platform (UWP) apps.

Creating NuGet packages

If I decide to build a C# library from which I will produce a NuGet package, the first step is to target the library at the chosen .NET Standard version at the project level:

Target Framework in Rider

The result of clicking around in any GUI is a change to the project file:

<Project Sdk="Microsoft.NET.Sdk">
    <PropertyGroup>
        <TargetFramework>netstandard1.3</TargetFramework>
    </PropertyGroup>
</Project>

Done. The project now compiles, and I can publish it as a NuGet package.

Creating a NuGet package with the dotnet CLI

There is more than one way to create a NuGet package. In this article I will describe a very elegant solution using the dotnet CLI. To use the tool you need to install the .NET Core SDK, which includes the dotnet CLI. After installing the SDK, you can verify it works by calling the dotnet command in CMD or a terminal.

The dotnet tool in a terminal

Once dotnet is available, just move to the project level and first call dotnet build to build the project, then dotnet pack to create the NuGet package. The latter is also very easy to configure:

dotnet pack options

Setting the NuGet package information

Because we will want to attach useful information to the NuGet package, such as its name, version, description, icon and so on, we must prepare a specification. The 'old way' would be a nuspec file, but it is easier to include the information directly in the project file. All of the package details are referred to as metadata, and a complete overview of them is available online in the MSDN documentation.

A complete project file (csproj) can then look, for example, like this:

<Project Sdk="Microsoft.NET.Sdk">
    <PropertyGroup>
        <TargetFramework>netstandard1.3</TargetFramework>
        <PackageId>FioSdk</PackageId>
        <Version>2.0.0</Version>
        <Description>FIO Banka SDK for C#</Description>
        <Copyright>© 2016 - 2018</Copyright>
        <Language>en-US</Language>
        <Authors>Miroslav Holec</Authors>
        <Title>FIO Banka SDK for C#</Title>
        <PackageIconUrl>https://holecdrive.blob.core.windows.net/nuget/fio-sdk.png</PackageIconUrl>
        <PackageProjectUrl>https://github.com/mholec/fio-sdk-csharp</PackageProjectUrl>
        <PackageLicenseUrl>https://github.com/mholec/fio-sdk-csharp/blob/master/LICENSE</PackageLicenseUrl>
        <PackageTags>fio, sdk, api, banka</PackageTags>
        <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
    </PropertyGroup>
    <ItemGroup>
        <PackageReference Include="Newtonsoft.Json" Version="9.0.1"/>
    </ItemGroup>
</Project>

One advantage is obvious at first glance: dependencies on other NuGet packages (we do not have to list these dependencies a second time in a nuspec). I would also highlight the GeneratePackageOnBuild element, which ensures that every build also produces a new NuGet package. After building the project on a build server, I can then copy the nupkg files into the artifacts and prepare them for release.

You will find the created NuGet package (the nupkg file) in the bin folder, along with the folder containing its contents (including the dll).

The created nupkg package

Part of a recorded talk in which I speak about .NET Standard

Closing notes

The described way of creating NuGet packages is certainly not the only one; I just think it may be the easiest way for many developers to get started with packaging. In one of my next articles I will describe how to set up CI builds and Continuous Deployment.

 

Miroslav Holec

CHALLENGE #2 – Geocode customers (LEVEL 3)


Scenario

As customers are created or modified, their addresses are automatically geocoded and their locations are stored in a set of new fields on the Customer table. A factbox showing the current location of the customer on a map is added to the Customer Card, and links to open a large map or get directions are available as well. As a bonus, you can create an action which will display all customers within a certain range on a map.

To complete this challenge, you will need

  • A Dynamics 365 Business Central Sandbox Environment
  • Visual Studio Code with the AL Extension installed
    • Azure VMs will have VS Code pre-installed
  • A BingMaps API Key from https://www.bingmapsportal.com (Public WebSite or Dev/Test key)

Expected result

Steps

  • Create an empty app
  • Create a table extension for the customer table and add fields
  • Create a page extension for the customer card and add fields
  • On the OnBeforeModify and OnBeforeInsert triggers on the Customer table, geocode the customer address using BingMaps API
  • Create a CardPart page showing a map of the geocoded position
  • Create an action opening a browser with all customers as shared places

Hints

  • In VS Code, use Ctrl+Shift+P, type AL GO, and remove the customerlist page extension
  • Use the ttableext, the tfield and the tkey snippets
  • Use the tpageext and the tpagefield snippets
  • Use HttpClient to communicate with the Web Service and use Json types (JsonObject, JsonToken, JsonArray and JsonValue) to extract values from the Web Service result
  • Use the tpage snippet and add a usercontrol with the WebPageViewer. Use https://www.bing.com/maps/embed-a-map to see the HTML you can use in the control.
  • Add location of all customers to the sp parameter as explained here: https://msdn.microsoft.com/en-us/library/dn217138.aspx

Cheat sheets

 

Happy coding

Freddy Kristiansen
Technical Evangelist


CHALLENGE #3 – Include twitter feed on the customer card (LEVEL 2)


Scenario

To form the basis for monitoring customer sentiment, opportunities, or issues, enter a twitter handle for your customers and show a twitter timeline for the customer on the customer card.

To complete this challenge, you will need

  • A Dynamics 365 Business Central Sandbox Environment
  • Visual Studio Code with the AL Extension installed
    • Azure VMs will have VS Code pre-installed

Expected result

Steps

  • Create an empty app
  • Create a table extension for the customer table and add field for twitter handle
  • Create a page extension for the customer card and add field for twitter handle
  • Create Custom Control for showing Twitter feed
  • Create a CardPart page showing a twitter feed
  • Add the Twitter Feed card part to customer card

Hints

  • In VS Code, use Ctrl+Shift+P, type AL GO, and remove the customerlist page extension
  • Use the ttableext and the tfield snippets to create the table extension
  • Use the tpageext and the tpagefield snippets to create the page extension
  • Use the tcontroladdin snippet to create the custom control
  • Use the tpage snippet to create the cardpart page
  • Add the Twitter feed part to the factboxes area.

Cheat sheets

  • Create an empty app
  • Create a table extension
  • Create a page extension
  • Create a Twitter Feed custom control
  • Create a Card Part with the Twitter Feed custom control
  • Add card part to customer page

 

Happy coding

Freddy Kristiansen
Technical Evangelist

Debug a Dynamics 365 for Finance and Operations on-premises instance without Visual Studio


In this post I'm going to explain how to debug an error occurring in Dynamics 365 for Finance and Operations on-premises - directly in the on-premises environment, where Visual Studio isn't available, by using a free tool called WinDbg.

This approach gives you a fast way to catch exceptions occurring in the environment, identify the call stack, see a more detailed error message (for example, inner exceptions), and see the values of variables at the time of the exception. You can use this approach not only for debugging the AOS itself, but for any component in Windows which is running .NET code - for example, if SSRS was throwing an exception, you could do the same thing to debug SSRS itself.

It does not give the full X++ debugging experience you would normally have using Visual Studio with the Dynamics dev tools installed - I will be making another post soon explaining how to hook up Visual Studio to debug your on-premises instance.

Overview

WinDbg is a very powerful debugging tool and can be used in many different scenarios - for example debugging an exception occurring in any Windows software or analyzing memory dumps (also known as crash dumps) from a Windows process.

In this document we'll look at one particular scenario to give an introduction to the tool and how it can be helpful in conjunction with Dynamics 365 for Finance and Operations on-premises to troubleshoot exceptions.

The example scenario here is:
- I have an external application trying to call into Finance and Operations web services
- The call is failing with "Unauthorized" in the calling application
- There is no error in the AD FS event log - AD FS is issuing a token fine, but the AOS is denying the call.
- I want to know why I am "Unauthorized" because it seems AOS should be allowing me

Prepare

First install WinDbg; it is available from the Windows SDK here

Note: there is a newer version of WinDbg currently in preview available in the Windows Store here, but my post here is only dealing with the old current released version.

Most of the install tabs you can just click through - but when choosing which options to install, uncheck everything except "Debugging Tools for Windows" as shown below:

Once the installer completes you will find WinDbg on your Windows start menu - both x64 and x86 versions (and ARM and ARM64) will be installed. The rule for debugging .NET code with WinDbg is to match the version of WinDbg to the architecture of the process - 32-bit process, 32-bit WinDbg; 64-bit process, 64-bit WinDbg. As we are going to debug the AOS, which is 64-bit, we'll need to open WinDbg x64 - MAKE SURE to run it as Administrator, otherwise it won't let you attach to the process.

In a typical on-premises environment there will be 3 AOS instances - when we're debugging we're not sure which of the 3 AOS we'll hit, so we want to turn off the other two; then we know everything will hit the remaining one, and we can debug that one. There are two options to do that:
1. Shut down the other two AOS machines in Windows.
2. From Service Fabric Explorer, disable the AOS application for the other two AOS - if you take this route then you need to check in Task Manager that AXService.exe has actually stopped on both of those AOS machines, because I've found that it doesn't always stop immediately; it will sit there for a while and requests will continue to go to them.

Debug

Now that we have the tool installed, we're ready to debug something. In WinDbg go to "File" -> "Attach to process...", and a dialog will open showing all the processes currently running on that machine - select "AXService.exe" and click OK. It's easier to find in the list if you select the "by executable" radio button, which will alphabetize the list.

WinDbg is a command-line debugger; at the bottom of the window there is a box where you can enter commands for it to execute - that's primarily how you get it to do anything.

As we're going to debug .NET code, we'll first load an extension for WinDbg which will help us decode .NET-related information from the process. This extension exists on any machine which has the .NET Framework installed. Enter this command and hit Enter:

.load C:\Windows\Microsoft.NET\Framework64\v4.0.30319\sos.dll

Next we're going to tell WinDbg that when a .NET exception occurs it should stop the process on a breakpoint. Because we don't have source code available in an on-premises environment, the easy way for us to set a breakpoint is to base it on exceptions. The command for WinDbg to break on an exception is "sxe" and the exception code is "e0434352"; we always use this same exception code here, because it is the native Windows code representing all .NET exceptions.

sxe e0434352

Now we need to let the process run again - when we attached to the process, WinDbg automatically put a "break" on it. We can tell whether the process is running or not: if it's running, it says "Debuggee is running.." in the command prompt. To let the process run again, enter "g", meaning go.

g

After entering "g" you see it is running again:

OK, now we're ready to reproduce our issue, so I'm going to my client application and making the error happen; then in WinDbg I see this. Note that the client application will seem to "hang" - this is because WinDbg has stopped the AOS on a breakpoint and is not letting it complete the request:

We can run a command to show us the exception detail: "!pe". This command comes from the sos.dll extension we loaded earlier; the "!" denotes that it comes from an extension. Note that WinDbg is case sensitive in everything you enter.

Here I can see the exception from within the AOS - it's hard to see in the screenshot, so here's the full text:

0:035> !pe
Exception object: 000002023b095e38
Exception type: System.IdentityModel.Tokens.SecurityTokenInvalidAudienceException
Message: IDX10214: Audience validation failed. Audiences: 'https://ax.d365ffo.zone1.saonprem.com/namespaces/axsf/'. Did not match: validationParameters.ValidAudience: 'null' or validationParameters.ValidAudiences: 'https://ax.d365ffo.zone1.saonprem.com, 00000015-0000-0000-c000-000000000000, https://ax.d365ffo.zone1.saonprem.com/'
InnerException:
StackTrace (generated):
StackTraceString:
HResult: 80131501

I'm not going to explain the example error message in this post - but if you're interested, it is explained here

Next we can see the call stack leading to this exception by running "!clrstack". It's worth noting that the first time you run this command on a machine where you've never used WinDbg before, it might spin for a couple of minutes - that happens because WinDbg is looking for symbols; after the first time, it will run straight away. This command is useful to understand what the AOS was trying to do when the exception occurred - it's not necessary to have all of the source code to make sense of the call stack. Most times I am looking at this, I am simply reading the method names and making an educated guess about what it was doing based on the names (of course it's not always that simple, but often it is).

!clrstack

The last command for this post shows the running .NET variables relating to the call stack we just saw. This command is useful for understanding what values the AOS was running with - similar to my approach with !clrstack, I simply look through this list for human-readable values I recognize - for example, if it was an exception in a purchase order process, I'd be looking for something which looks like a vendor account number or PurchId. This is particularly useful when the value the AOS is running with isn't the value you expected it to be running with.

!dso

That's all for now, happy debugging!

Marketplace customers will receive an error when trying to acquire Paid Extensions and subscriptions. – 05/16 – Mitigating


Update: Wednesday, May 16th 2018 13:12 UTC

Our DevOps team continues to investigate issues with Marketplace. The root cause is understood and a mitigation has been identified; we are working on rolling out the hotfix in the next 2 hours. The problem began at 09:00 UTC on May 16th 2018.

  • Next Update: Before Wednesday, May 16th 2018 15:15 UTC

Sincerely,
Niall


Initial Update: Wednesday, May 16th 2018 12:10 UTC

We're investigating an issue where Marketplace customers will receive an error when trying to acquire Paid Extensions and subscriptions.

  • Next Update: Before Wednesday, May 16th 2018 13:10 UTC

Sincerely,
Niall

Azure AI Gallery enables developers and data scientists to share their analytics solutions.


Azure AI Gallery is a community-driven site for discovering and sharing solutions. Learn how to contribute.


The Gallery has a variety of resources that you can use to develop your own analytics solutions.

Students can try Azure Machine Learning for free. No credit card or Azure subscription is required. http://aka.ms/azure4students

What can I find in the Gallery?

The Azure AI Gallery contains a variety of resources that you can use to develop your own analytics solutions.

  • Experiments - The Gallery contains a wide variety of experiments that have been developed in Azure Machine Learning Studio. These range from quick proof-of-concept experiments that demonstrate a specific machine learning technique, to fully-developed solutions for complex machine learning problems.
  • Jupyter Notebooks - Jupyter Notebooks include code, data visualizations, and documentation in a single, interactive canvas. Notebooks in the Gallery provide tutorials and detailed explanations of advanced machine learning techniques and solutions.
  • Solutions - Quickly build Azure AI Solutions from solution templates, reference architectures and design patterns. Make them your own with the included instructions or with a featured partner.
  • Projects - Explore projects contributed by experienced members of the data science community. A project is a collection of scripts, notebooks, and/or data designed to support the everyday work of data scientists.
  • Tutorials - A number of tutorials are available to walk you through machine learning technologies and concepts, or to describe advanced methods for solving various machine learning problems.
  • Models - Explore a growing collection of machine learning models that can be utilized in building projects and solutions.
  • Custom Models – Download these custom Azure Machine Learning modules and deploy them into your experiments. Custom modules expand the capabilities of Azure Machine Learning Studio to allow you to develop even more advanced predictive analytics solutions.

These basic Gallery resources can be grouped together logically in a couple different ways:

  • Collections - A collection allows you to group together experiments, APIs, and other Gallery items that address a specific solution or concept.
  • Industries - The Industries section of the Gallery brings together various resources that are specific to such industries as retail, manufacturing, banking, and healthcare.

Simple and robust way to operationalise Spark models on Azure


Let's operationalise a Spark regression model

Lately I've been doing a lot of work with Azure Databricks, which is a superb Spark-based analytics platform fully integrated with Azure. It gives you everything that open-source Spark does and then some. I've especially been enjoying the effortless ways to move large datasets around and the ease of MLlib for my AI projects.

One of the questions with the simpler models, like regressions and clusterings, is always how to operationalise the models so that the rest of the organisation can consume them by calling web services. Luckily there is model export functionality that takes us almost there. It exports Spark MLlib models so that they can be run anywhere there is a JVM available ... no Spark needed.

So I decided to build the missing half: wrap the model inside a simple Java web app and deploy it to Azure to be consumed.

Some theory

First read this article to properly understand the model export feature: https://docs.databricks.com/spark/latest/mllib/model-export.html

Then you might want to take a look at some sample code on instancing the exported models and how to use them for predicting stuff:
https://github.com/databricks/databricks-ml-examples/tree/master/model-export-demo

Finally, you should download my sample IntelliJ project and use it as a base for your own operationalising:
https://petsablob.blob.core.windows.net/share/siirto/modeltestwebapp.zip

Important bits

The solution itself is ridiculously simple, and I'll go through it here for your pleasure.
I set up a basic Maven archetype web app project with IntelliJ IDEA and added one utility class and one library (read the docs).

This is my index.jsp file that does the interfacing with the caller.

Here we basically set up the parameter block structure that goes into the model and take the values from the web request that was sent to us. The request could be something like this: https://YOURSERVER.azurewebsites.net/?topic=sci.space&text=space%20the%20final%20frontier
Then the values are handed over to our utility class to be acted upon, and the result is returned to the caller ... not too complex, eh?

The utility class is even simpler ...
It knows how Azure treats Java apps and how you can refer to files contained in the war package (our model files, that is).
Knowing all this, it creates a static instance of the model and starts churning out predictions when asked to do so.

I then used Microsoft's excellent IntelliJ Azure plugin to deploy this project to ... you guessed it ... Azure.

Summary

The funny thing is that, done this way, your service automagically enjoys the benefits of Azure Web App autoscaling and Azure Authentication (Easy Auth - and yes, it works with Java too), making this a real production-ready enterprise solution ... with embarrassingly few lines of code.

New AI Services in Azure for students and academics announced at Build 2018



Azure Cognitive Services

1. Object Detection update to Custom Vision (preview) http://aka.ms/cognitive

2. Video Indexer (paid preview) https://azure.microsoft.com/en-us/blog/build-2018-video-indexer-updates/

3. New OCR model (preview) https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/

4. Speech Recognition with customization http://aka.ms/cognitive

5. Speech Synthesis with customizable voice https://cris.ai/Home/CustomVoice

6. QnA Maker (generally available) https://blog.botframework.com/2018/05/07/announcing-general-availability-of-qnamaker/

7. Bing Services updates https://blogs.bing.com/Developers-Blog/2018-05/six-new-and-improved-bing-ai-offerings-on-microsoft-cognitive-services

8. Cognitive Search (public preview) https://www.ailab.microsoft.com/experiments/7d6b0652-51dc-440d-a12a-481f28525143

9. Cognitive Services Labs https://labs.cognitive.microsoft.com/

Conversational AI

1. Bot Builder SDK v4 (preview) - Bot Builder homepage or the Bot Builder GitHub https://docs.microsoft.com/en-us/azure/bot-service/dotnet/bot-builder-dotnet-overview?view=azure-bot-service-3.0

2. Improved Bot Framework Emulator (preview) https://github.com/microsoft/botframework-emulator

3. New Bot Builder Tools https://github.com/Microsoft/botbuilder-tools

4. Bot Service features - authentication now available https://docs.microsoft.com/en-us/azure/bot-service/

5. QnA Maker (GA) https://blog.botframework.com/2018/05/07/announcing-general-availability-of-qnamaker/

6. Project Conversation Learner https://labs.cognitive.microsoft.com/en-us/project-conversation-learner

7. Project Personality Chat https://labs.cognitive.microsoft.com/en-us/project-personality-chat

Custom AI

1. Azure ML Packages Preview – http://aka.ms/aml-packages

2. Azure ML and Project Brainwave – http://aka.ms/aml-real-time-ai

3. ONNX Model Gallery http://gallery.azure.ai/models

4. ML.NET Preview http://github.com/dotnet/machinelearning

5. Vision AI Developer Kit http://github.com/azure/ai-toolkit-iot-edge https://visionaidevkit.com

Azure for Students

Getting access to Azure and these services is super simple; see http://aka.ms/azure4students

Build on the Microsoft 365 Platform


There are multiple aspects of the Microsoft 365 platform that will interest developers, as was expounded during Microsoft Build 2018

The list of key sessions delivered for developing on the Microsoft 365 platform is at Build-2018-Updates

Apart from the keynotes, there are about 45+ sessions listed on topics like Microsoft Graph, Fluent Design System, Adaptive Cards, Mixed Reality, and Teams

For a summary on this topic you could also check out the article here


Performance Degradation in West US 2 – 05/16 – Investigating


Initial Update: Wednesday, May 16th 2018 17:54 UTC

We're currently investigating performance degradation in the West US 2 region. Users hosted in this region may experience slowness while accessing their Visual Studio Team Services accounts.

  • Next Update: Before Wednesday, May 16th 2018 18:25 UTC

Sincerely,
Krishna Kishore

Premium Messaging: High Performance Without Partitions


When Premium Messaging became generally available almost two years ago, by default when creating entities for your namespaces you got two partitions. This was a relatively new concept for some Service Bus customers who were used to one partition. Our reasoning at the time was to improve availability for a nascent service. Well, the team has come a long way in two years in improving the performance and availability of Premium - so much so that we no longer enable partitions by default; in fact, we have disabled the ability to use partitions in new Premium instances. Partitions are no longer needed for availability and performance reasons in Premium Messaging. But don't worry: if you continue to use existing partitioned entities with your Premium namespaces, that is totally okay and won't change.

To summarize some important aspects of this:

  • Existing entities with partitions in Premium will not be changed, they will remain the same
  • New entities created with the SDKs, CLI, PowerShell, ARM, or the Azure portal will be non-partitioned entities by default, without the ability to enable partitions (see the sketch after this list)
  • There is no decrease in availability due to one partition, it will be just as good as partitioned entities
  • DO NOT set enable partition with ARM templates, by default Service Bus will not allow this
  • For Service Bus Standard Messaging customers we still encourage partitioning your entities
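To make the second point concrete, here is a minimal C# sketch of creating a non-partitioned queue with the Microsoft.ServiceBus client of that era; the connection string and queue name are placeholders. Since EnablePartitioning defaults to false, simply not setting it gives you a non-partitioned entity:

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

class CreatePremiumQueue
{
    static void Main()
    {
        // Placeholder connection string for a Premium namespace
        var ns = NamespaceManager.CreateFromConnectionString("<premium connection string>");

        // EnablePartitioning is left at its default (false), which is the only
        // valid choice for new Premium entities
        var description = new QueueDescription("myqueue")
        {
            RequiresDuplicateDetection = true // no partition key handling required
        };

        ns.CreateQueue(description);
    }
}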

Additional Benefits

  • A big benefit is that you no longer have to set partition keys for features like sessions, transactions, send via, and duplicate detection
  • If you use SignalR with Service Bus as the backbone, it will now work seamlessly with Premium Messaging, as SignalR only supports non-partitioned entities

Happy Messaging!

Improving the responsiveness of critical scenarios by updating auto load behavior for extensions


The Visual Studio team partners with extension authors to provide a productive development environment for users, who rely on a rich ecosystem of quality extensions. Today, we're introducing an update to extension auto load based on feedback from our community of developers, who need to quickly start Visual Studio and load their solution while deferring other functionality to load in the background.

As part of ongoing performance efforts to guarantee a faster startup and solution load experience for all users, Visual Studio will change how auto loaded packages work during startup and solution load scenarios. Please see the upcoming changes for extension authors below and let us know if you have any questions; the team is actively answering questions on the ExtendVS channel on Gitter.

Upcoming changes:

In Visual Studio 2015, we added support for asynchronous packages (AsyncPackage base class) and asynchronous auto load. Extensions have been opting into asynchronous load to reduce performance issues since then. However, there are still some extensions that are loading synchronously, and it is negatively impacting the performance of Visual Studio.

In light of that, changes are coming to start the process of turning off synchronous auto load support. This will improve the user experience and guarantee a consistent startup and solution load experience, providing a responsive IDE. As part of this, auto load behavior in a future Visual Studio update will change as follows:

  1. Async packages that load in the background have a smaller performance impact than synchronously loaded packages, but the cost is still non-zero due to I/O contention with the foreground thread when starting Visual Studio or opening a solution. To optimize the startup and solution load scenarios specifically, the IDE will not auto load async packages during those scenarios, even on background threads. Instead, the IDE will push all auto load requests into a queue. Once startup or solution load is completed, the IDE will start loading queued packages asynchronously as it detects pauses in user activity. This could mean that a package is never automatically loaded in a short session, or that packages which were queued to be loaded during startup might not load before a user opens a solution.
    Please note that this covers all auto load requests regardless of the source UI context. For example, asynchronous auto load requests from any UI context (e.g. SolutionHasSingleProject), or from rule-based UI contexts that were activated while a solution was being loaded, will now be added to the queue as well. Other sources of package loads, such as project factory queries and service queries, will not be impacted by this change.
  2. All packages that utilize auto load rules will have to support background load and implement asynchronous initialization. The IDE will no longer synchronously auto load packages in any UI context, including rule-based UI contexts.

While asynchronous load support was added in Visual Studio 2015, we know many extensions also want to support Visual Studio 2013 in a single package. In order to make that possible, we have provided a sample that shows how to create a Visual Studio package that loads synchronously in Visual Studio 2013 but also supports asynchronous load in Visual Studio 2015 and above.
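
For Visual Studio 2015 and above, the async half of that looks roughly like the following sketch (the package name and GUID are hypothetical; this is not the dual-targeting sample itself):

    using System;
    using System.Runtime.InteropServices;
    using System.Threading;
    using Microsoft.VisualStudio.Shell;
    using Task = System.Threading.Tasks.Task;

    // AllowsBackgroundLoading opts the package into asynchronous initialization, and
    // PackageAutoLoadFlags.BackgroundLoad requests that the auto load happen in the background.
    [PackageRegistration(UseManagedResourcesOnly = true, AllowsBackgroundLoading = true)]
    [ProvideAutoLoad(UIContextGuids80.SolutionExists, PackageAutoLoadFlags.BackgroundLoad)]
    [Guid("11111111-2222-3333-4444-555555555555")] // hypothetical package GUID
    public sealed class MyAutoLoadPackage : AsyncPackage
    {
        protected override async Task InitializeAsync(
            CancellationToken cancellationToken, IProgress<ServiceProgressData> progress)
        {
            // This runs on a background thread: do as much initialization here as possible.

            // Switch to the UI thread only for work that truly needs it.
            await JoinableTaskFactory.SwitchToMainThreadAsync(cancellationToken);
            // ... register UI-bound services, commands, etc.
        }
    }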

Timing:

The Visual Studio team is committed to working with extension owners to help make these changes as soon as possible and with as little disruption as possible for end-users. So, the changes will be phased in over multiple updates:

Visual Studio 2017, version 15.7:

  • The Visual Studio Marketplace is posting a reminder during submission of a non-compliant extension (i.e., an extension that auto-loads but is not an async-package that supports background load).
  • The Visual Studio SDK includes a new analyzer that will issue a build reminder for non-compliant extensions.

Visual Studio 2017, version 15.8:

  • Async packages that support background load will be loaded after Visual Studio startup and solution load are completed (this is update #1 mentioned above).

In a later update, Visual Studio will completely disable auto-loading of synchronous extensions (update #2 mentioned above). End users will see a notification in Visual Studio informing them about extensions that were impacted.

Impact on package implementations:

These changes may require updates to existing packages that use the ProvideAutoLoad attribute and inherit from the Package base class, including but not limited to:

  • Synchronous packages (those inheriting from the Package base class) must be converted to support asynchronous loading and enable background load. We also encourage package owners to move initialization code to the thread pool as much as possible to ensure users continue to see a responsive IDE. We will be monitoring extensions and UI delays to track responsiveness issues caused by auto loaded packages. You can find more information on diagnosing package auto load performance in our guidance on Microsoft Docs.
    In order to catch potential issues with async conversion, we encourage all package owners to install the latest SDK and Threading analyzers from NuGet into their projects.
  • If your package needs the main thread because it calls into UI-thread-bound Visual Studio APIs that take a long time to execute, please let us know, as we are looking for opportunities to convert such services to implement async methods or become free-threaded, to avoid responsiveness issues when loading packages in the background.
  • Packages that used to load at the beginning of solution load and rely on solution events will need to change their implementation, as they will no longer receive those events. Instead, the package can enumerate the contents of the solution when the extension is loaded. See the code sample (a minimal sketch also follows this list).
  • Likewise, packages that used to load at startup and relied on solution events will have to handle the case where they are loaded after solution load has completed. It is possible for solution load to occur during, or closely after, startup, giving the IDE no chance to load startup packages first (altering the load behavior of packages from previous versions of Visual Studio).
  • Packages that register command status handlers will need to ensure their default command states are valid, since with these changes there will be a timeframe where the QueryStatus handlers are not registered. Generally, we encourage package owners to utilize rule-based UI contexts as much as possible to determine command states via metadata instead of code-based QueryStatus handlers (see the second sketch after this list), and we will be looking for feedback on what additional terms can be added to rule-based UI contexts to move away from code-based handlers.
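
Here is the first sketch: a hypothetical method (the class, method, and variable names are mine) showing how a package that can no longer count on solution events might enumerate the projects that are already loaded:

    using System;
    using System.Threading.Tasks;
    using Microsoft.VisualStudio;
    using Microsoft.VisualStudio.Shell;
    using Microsoft.VisualStudio.Shell.Interop;
    using Task = System.Threading.Tasks.Task;

    public sealed class ProjectEnumerationPackage : AsyncPackage
    {
        private async Task EnumerateLoadedProjectsAsync()
        {
            // IVsSolution is UI-thread bound, so switch to the main thread before using it.
            await JoinableTaskFactory.SwitchToMainThreadAsync(DisposalToken);

            var solution = (IVsSolution)await GetServiceAsync(typeof(SVsSolution));
            var onlyThisType = Guid.Empty; // empty GUID = enumerate all project types
            ErrorHandler.ThrowOnFailure(solution.GetProjectEnum(
                (uint)__VSENUMPROJFLAGS.EPF_LOADEDINSOLUTION, ref onlyThisType, out IEnumHierarchies hierarchies));

            var current = new IVsHierarchy[1];
            while (hierarchies.Next(1, current, out uint fetched) == VSConstants.S_OK && fetched == 1)
            {
                // Process each already-loaded project (current[0]) here, instead of
                // waiting for solution load events the package may never receive.
            }
        }
    }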

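And the second sketch: a rule-based UI context declared entirely in metadata (all GUIDs and term names here are hypothetical). Commands can then reference the context GUID from a VisibilityConstraints entry in the .vsct file instead of a code-based QueryStatus handler:

    using System.Runtime.InteropServices;
    using Microsoft.VisualStudio.Shell;

    [PackageRegistration(UseManagedResourcesOnly = true, AllowsBackgroundLoading = true)]
    [ProvideUIContextRule("c6f7ef8a-1111-2222-3333-444455556666", // hypothetical context GUID
        name: "CSharpFileSelected",
        expression: "SingleCSharpFile",
        termNames: new[] { "SingleCSharpFile" },
        termValues: new[] { @"HierSingleSelectionName:\.cs$" })]
    [Guid("77777777-8888-9999-aaaa-bbbbbbbbbbbb")] // hypothetical package GUID
    public sealed class CommandStatePackage : AsyncPackage
    {
        // The UI context above is activated by the IDE from metadata alone; this
        // package never has to load just to compute a command's visibility.
    }
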
Testing async packages that auto load in the background:

Update #1 mentioned above will change the timing of when async packages auto load in the background. To help you test your package with this behavior, Visual Studio 2017 versions 15.6 and 15.7 include the new auto load manager in the product behind a feature flag (in version 15.8 Preview 2 and later, it will be enabled by default). With this feature flag enabled, Visual Studio will defer auto loading of async, background-loadable packages until startup and solution load complete and Visual Studio has been idle for some time. Synchronously auto loading packages will see no change in behavior.

To enable the new auto load behavior, you can run both of the following commands from your Visual Studio installation directory:

    vsregedit set <VSROOT> HKCU FeatureFlags\Shell\AutoLoadRestrictions Value dword 1

    vsregedit set <VSROOT> HKLM AutoLoadPackages AllowSynchronousLoads dword 1

You can use the following command to change the idle time to a large value to aid in testing your extension. For instance, to set the idle time to 60 seconds:

    vsregedit set <VSROOT> HKLM AutoLoadPackages MinimumInputIdleTime dword 60000

To go back to existing behavior, you can run:

    vsregedit set <VSROOT> HKCU FeatureFlags\Shell\AutoLoadRestrictions Value dword 0

Resources:

Mads Kristensen, Senior Program Manager
@mkristensen

Mads Kristensen is a senior program manager on the Visual Studio Extensibility Team and has published over 100 free Visual Studio extensions over the course of the past 5 years.

The “P” in “PM”


In the following post, Premier Developer Consultant Ilias Jennane sheds more light on the PM role and how it fits into the bigger picture of DevOps.


In recent discussions with customers and colleagues about why large I.T. shops often struggle to adhere to Agile principles or become DevOps organizations, I couldn't help but notice something extremely simple, yet tremendously important.

Large I.T. shops manage projects, while software shops build products. One would think they are the same thing, but that is where the gap begins. On one hand you have product owners (or product managers), and on the other you have project managers. One is focused on the continuous delivery of value, whereas the other is focused on the delivery of a known set of artifacts constrained by time, budget, and scope.

Read more of his post here.

Email and publish Calendar in Outlook now gone…and how to get them back


Please note I am not on the Outlook team...I just use their product A LOT.

In the last Office update I noticed both the Publish Calendar and Email Calendar functionality were missing from my Outlook, and from that of all the other Microsoft employees. Interestingly, all the docs AND the product UI reference this functionality as if it remains.

To Email Your Calendar

To get Email Calendar back, you must add a new Ribbon group and add the Email Calendar command to it.

While it appears you can also add the Publish Calendar command, that command never seems to be enabled now.

To publish a calendar

I must admit I couldn't figure out a way to do this in Outlook and had to resort to OWA.

Under Calendar settings there is a Calendar publishing option that will enable you to publish calendars to the Internet:

https://outlook.office.com/owa
