
Agenda is Now Released for Office 365 DevDays!


Click here to look at the Office 365 DevDays agenda! You’ll be able to engage, interact, and develop with our Microsoft Office platforms during our three-day event.

This event is tailor-made for you if you are interested in any of the following activities:

  • Learning what Office 365 is and why Office as a platform service can help you successfully build powerful business solutions that integrate with popular workloads such as Microsoft Exchange Online, SharePoint Online, and Microsoft Teams, and connect to millions of users and terabytes of data
  • Workshop training on Office 365 platform technologies, including Microsoft Graph, AI, and Office add-ins
  • Participating in Hackathon projects
  • Discussing business or technical collaboration with Microsoft
  • Seeking technical help from Microsoft engineers on-site

 

You will be richly rewarded with training certification, hackathon prizes, and, most important of all, deepened knowledge and skills to develop amazing Office 365 integrated business solutions.

The event will take place from November 4-6, 2017 at the Caohejing New Convention Center. Go to our Interop website for more event details.

Don’t forget to register today:

Please note that most sessions during this event will be conducted in Chinese.

Contact us with any questions at officedevdays@microsoft.com.


One Year Back



As of last week, I've been back at Microsoft for a year now. I knew Azure was a fast-moving product, but holy crap! It never stops. I remember when I started at Microsoft we spent something like 3+ years working on Visual Studio .NET & .NET. Nobody can afford to be that leisurely these days. Without the accompanying risk to life, it's like moving from Class I rapids to Class IV or V.

Here are a few things that I'm working on that you can find on a regular basis:

  • Azure Friday - Many of you may be familiar with Scott Hanselman's video series that he started back in 2013. It may surprise some people to know that Azure is not Scott's day job. I started helping Scott with Azure Friday almost a year ago to limit his burden to the time he spends in the studio. A byproduct of that is that we're able to produce and publish more episodes, which means we've outgrown the name. Perhaps you should think of it as Azure Fri-yay, or every day is Friday in Azure?
  • Last week in Azure - This is a new thing I started on the Azure blog to help current and prospective Azure customers keep up to speed with what's happening in Azure each week. To consume all the content that comes out each week is nearly a full-time job on its own. Ain't nobody got time for that.

I love feedback and I'm always looking for ways to improve, so please let me know how this content can better serve your needs. Also, let me know if there are particular topics you'd like to see covered on Azure Friday.

I'm also the Product Marketing Manager for Azure Cloud Shell, which has been a fantastic learning experience. I've been in marketing at Microsoft for almost a decade, and this is the first time I've owned part of the product. One thing I really enjoy about Cloud Shell is watching it proliferate: the Azure portal, the Azure mobile app, the Azure documentation, etc. And it's still early days for Cloud Shell, with much more to come.

Here are just a few other things I've worked on or contributed to in the past year that you can find online:

  • Cloud Architecture Whiteboard Webinar Series - I interviewed members of the patterns & practices team on how to address common challenges that engineers and developers face when designing cloud-based solutions. Each webinar in the series will focus on a set of design patterns that address a fundamental design challenge.
  • Cloud Design Patterns Poster - This poster depicts common problems in designing cloud-hosted applications and design patterns that offer guidance. Each of the patterns covered in the webinar series is included here, but the poster is a superset of those and others.
  • Web for Containers explainer video - Some of the hardest content projects are when you have to be as concise as possible because you only have x words or y minutes. It's living the classic Pascal quote, "Je n'ai fait celle-ci plus longue que parce que je n'ai pas eu le loisir de la faire plus courte." ("I have made this longer only because I have not had the time to make it shorter.")
  • Everything else I've published on Channel 9 - There are things on there other than Azure Friday. Sometimes it's a minor post-production role, other times it's deeper involvement in the content itself. But as demonstrated in the webinar series above, I'm better off behind the camera instead of in front of it.

I'm looking forward to more Azure fun!

How to safeguard SQL Server on Linux from OOM-Killer


On a Windows-based server, when all the available memory including the page file is consumed, the server's performance becomes sluggish and out-of-memory errors are logged to the event logs. On Linux systems, the behavior is slightly different. When the server is running low on memory, the Linux kernel will choose a process to be killed to restore smooth operation of the system. This mechanism on Linux is called OOM-Killer. More information is available at https://linux-mm.org/OOM_Killer

On servers running SQL Server, the killed process could very likely be the SQL Server process, as it is expected to have a larger memory footprint compared to other processes. In this blog, we shall review a customer scenario that highlights the need to perform additional configuration adjustments once SQL Server is installed on Linux.

The SQL Server support team recently worked through the following customer scenario.

  • The customer was running SQL Server 2017 on Red Hat Enterprise Linux.
  • The server had 12GB of RAM.
  • SQL Server was installed with the default configurations.
  • Some databases were part of an Always On availability group. The clustering layer is based on the Red Hat Enterprise Linux (RHEL) HA add-on built on top of Pacemaker.

When an index rebuild was kicked off on a large table (around 25GB), the reindex operation terminated, and the availability group failed over to the other replica.

Upon further investigation, we discovered that the SQL Server process terminated at the time the reindex operation was run, and this resulted in the failover.

To determine the reason for the unexpected shutdown, we reviewed the Linux system logs (/var/log/messages on RHEL) and the Pacemaker logs. From the Pacemaker logs and system logs, we saw entries indicating that oom-killer was invoked and, as a result, the SQL Server process was terminated.

Here are some relevant entries from the pacemaker log. Similar information can also be seen in /var/log/messages file.

Sep 13 16:17:30 l99s0004 kernel: [9264025.516359] sqlservr invoked oom-killer: gfp_mask=0x280da, order=0, oom_score_adj=0

 

Sep 13 16:17:30 l99s0004 kernel: [9264025.516555] 184007 total pagecache pages

Sep 13 16:17:30 l99s0004 kernel: [9264025.516556] 0 pages in swap cache

Sep 13 16:17:30 l99s0004 kernel: [9264025.516558] Swap cache stats: add 0, delete 0, find 0/0

Sep 13 16:17:30 l99s0004 kernel: [9264025.516558] Free swap  = 0kB

Sep 13 16:17:30 l99s0004 kernel: [9264025.516559] Total swap = 0kB

Sep 13 16:17:30 l99s0004 kernel: [9264025.516560] 3145598 pages RAM

 

Note: The "Free swap = 0kB" and "Total swap = 0kB" entries indicate that no swap file was configured on the server.

 

The log also shows a snapshot of the memory consumption of all the processes on the system at the time oom-killer was invoked. The output is shown in a tabular format with only three processes below for better readability.

 

 

Pid    Uid  Tgid   Total_vm  Rss      Nr_ptes  Swapents  Oom_score_adj  Name
46206  0    46206  26370     247      54       0         -1000          sshd
33492  992  33492  49412     3805     46       0         0              sqlservr
33495  992  33495  3029788   2418852  5059     0         0              sqlservr

 

Note: The Rss values are in 4 KB pages. The Rss column (memory usage) for the SQL Server process with PID 33495 works out to approximately 9448 MB ((2418852*4)/1024).

 

Sep 13 16:17:30 l99s0004 kernel: [9264025.516658] Out of memory: Kill process 33495 (sqlservr) score 799 or sacrifice child

Sep 13 16:17:30 l99s0004 kernel: [9264025.516709] Killed process 33495 (sqlservr) total-vm:12119152kB, anon-rss:9675408kB, file-rss:0kB, shmem-rss:0kB

Sep 13 16:17:30 l99s0004 kernel: sqlservr invoked oom-killer: gfp_mask=0x280da, order=0, oom_score_adj=0

 

From the "Killed process" line, we can see that SQL Server has an anon-rss value of 9675408kB. RSS stands for "resident set size", which is the amount of memory that is currently allocated in RAM for the process. file-rss is the file-backed portion of the resident memory, which is 0kB for the SQL Server process on this system.

When SQL Server starts, the amount of physical memory available to SQL Server is controlled by the memory.memorylimitmb configuration option, which by default is 80% of the physical memory. Based on this setting, the value 9675408kB (roughly 80% of 12GB) for SQL Server's anon-rss makes sense because the server has a total of 12GB of RAM.

This information can also be seen in the SQL Server error log during startup.

2017-09-13 16:23:17.62 Server Detected 9478 MB of RAM. This is an informational message; no user action is required.

Ref: https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-configure-mssql-conf#memorylimit

Due to the default configuration settings, SQL Server could use around 9.5GB of the 12GB of RAM installed on the server. This leaves around 2.5GB of memory for the Linux kernel and the other processes running on the server. To begin with, SQL Server was using only a portion of that 9.5GB, leaving plenty of memory available on the system. However, when the index rebuild was executed, SQL Server used up all of the 9.5GB it was allowed to use. This left very little memory on the Linux server, causing the oom-killer to be invoked. SQL Server was chosen as the victim because it had the highest memory usage (and therefore the highest oom-score). This behavior is expected.

To make SQL Server less susceptible to termination by oom-killer, we recommend one or both of the following suggestions.

  1. Adjust the memory.memorylimitmb configuration option carefully so that enough memory is left on the system even if SQL Server uses all of the memory configured through this setting (see the example after this list).
  2. Ensure that a swap file exists and is sized properly.
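For example, on a default installation the memory limit can be lowered with mssql-conf and the service restarted afterwards (6144 MB is an illustrative value, not a recommendation; the tool path assumes the default installation location):

sudo /opt/mssql/bin/mssql-conf set memory.memorylimitmb 6144
sudo systemctl restart mssql-server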

In this customer scenario, we were able to avoid the oom-killer by setting memory.memorylimitmb to 6GB, or by creating a swap file of an appropriate size.

For other best practices and configuration guidelines for SQL Server 2017 on Linux, please refer to documentation at https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-performance-best-practices .

 

Venu Cherukupalli

Senior Escalation Engineer

Microsoft SQL Server support.

Reviewed by: Suresh Kandoth, Denzil Rebeiro, Pradeep MM

 

Querying multiple tables by using joins in Transact-SQL


In Skill 1.2 from Exam Ref 70-761 Querying Data with Transact-SQL, learn how to query multiple tables by using join statements based on provided tables, data, and requirements.


Skill 1.2: Query multiple tables by using joins

Often, data that you need to query is spread across multiple tables. The tables are usually related through keys, such as a foreign key on one side and a primary key on the other. You can then use joins to query the data from the different tables and match the rows that need to be related. This section covers the different types of joins that T-SQL supports: cross, inner, and outer.
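For example, a typical inner join matches rows through such a key relationship (the table and column names here are illustrative):

SELECT O.orderid, C.companyname
FROM Sales.Orders AS O
  INNER JOIN Sales.Customers AS C
    ON O.custid = C.custid;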

This section covers how to:

  • Write queries with join statements based on provided tables, data, and requirements
  • Determine proper usage of INNER JOIN, LEFT/RIGHT/FULL OUTER JOIN, and CROSS JOIN
  • Construct multiple JOIN operators using AND and OR
  • Determine the correct results when presented with multi-table SELECT statements and source data
  • Write queries with NULLs on joins

Read this sample chapter.

.NET Framework October 2017 Preview of Quality Rollup


Today, we are releasing the October 2017 Preview of Quality Rollup. This type of rollup is intended for businesses that want to preview or use quality improvements as soon as they are available.

Quality and Reliability

This release contains the following quality and reliability improvements.

CLR

  • Code optimization bug for x64 C# code targeting .NET Framework 4.6.1 and running on .NET Framework 4.7. [484415]

WPF

  • WPF touch stops working after many touch events due to reference counting issue. [460192]
  • WPF touch generates a NullReferenceException in System.Windows.Input.StylusWisp.WispLogic.ProcessInputReport with .NET Framework 4.7. [480909]
  • WPF crash caused by INVALID_POINTER_WRITE_c0000005_PenIMC_v0400.dll!CPimcContext::GetPenEventMultiple. [488390]

Note: Additional information on these improvements is not available. The VSTS bug number provided with each improvement is a unique ID that you can give Microsoft Customer Support, include in Stack Overflow comments, or use in web searches.

Getting the Update

The Preview of Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

Product Version: Security and Quality Rollup KB

Windows 10 1607 (Anniversary Update): Catalog 4041688
  • .NET Framework 4.7: 4041688
  • .NET Framework 4.6.2: 4041688
  • .NET Framework 4.6.1: 4041688
  • .NET Framework 4.6: 4041688

Windows 8.1, Windows RT 8.1, and Windows Server 2012 R2: Catalog 4042078
  • .NET Framework 4.6, 4.6.1, 4.6.2, 4.7: 4041777
  • .NET Framework 4.5, 4.5.1, 4.5.2: 4040974
  • .NET Framework 3.5.1: 4040981

Windows Server 2012: Catalog 4042077
  • .NET Framework 4.6, 4.6.1, 4.6.2, 4.7: 4041776
  • .NET Framework 4.5.2: 4040975
  • .NET Framework 3.5: 4040979

Windows 7 and Windows Server 2008 R2: Catalog 4042076
  • .NET Framework 4.6, 4.6.1, 4.6.2, 4.7: 4041778
  • .NET Framework 4.5.2: 4040977
  • .NET Framework 3.5.1: 4040980

Windows Server 2008: Catalog 4042201
  • .NET Framework 4.6: 4041778
  • .NET Framework 4.5.2: 4040977
  • .NET Framework 2.0: 4040978

Known Issues

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

User accounts made easy with Azure


One of the most common requirements for a web application is to have users create accounts, for the purpose of access control and personalization. While ASP.NET templates have always made it easy to create an application that uses a database you control to register and track user accounts, that introduces other complications over the long term. As laws around user information get stricter and security becomes more important, maintaining a database of users and passwords comes with an increasing set of maintenance and regulatory challenges.

A few weeks ago I tried out the new Azure Active Directory B2C service, and was really impressed with how easy it was to use. It added user identity and access control to my app, while moving all the responsibility for signing users up, authenticating them, and maintaining the account database to Azure (and it’s free to develop with).

In this post I’ll briefly walk through how to get up and running with Azure B2C in a new ASP.NET Core app. It’s worth noting it works just as well with ASP.NET apps on the .NET Framework with slightly different steps (see walkthrough). I’ll then include some resources that will help you with more complex scenarios including authenticating against a backend Web API.

Step 1: Create the B2C Tenant in Azure

  • To get started, you’ll need an Azure account. If you don’t have one yet, create your free account now
  • Create an Azure AD B2C Directory
  • Create your policies (this is where you indicate what you need to know about the user)
    • Create a sign-up or sign-in policy
      • Choose all of the information you want to know about the user under “Sign-up attributes”
      • Select all the information you want passed to your application under “Application Claims” (note: the default template uses the “Display Name” attribute to address the user in the navigation bar when they are signed in, so you will want to include that)
    • Create a profile editing policy
    • Create a password reset policy
    • Note: After you create each policy, you’ll be taken back to the tab for that policy type which will show you the full name of the policy you just created, which will be of the form “B2C_1_<name_you_entered>”.  You’ll need these names below when you’re creating your project.
  • Register your application (follow the instructions for a Web App)
    • Note: You’ll get the “Reply URL” in the next step when you create the new project.

Step 2: Create the Project in Visual Studio

  • File -> New Project -> Visual C# -> ASP.NET Core Web Application
  • On the New ASP.NET dialog, click the “Change Authentication” button on the right side of the dialog
    • Choose “Individual User Accounts”
    • Change the dropdown in the top right to “Connect to an existing user store in the cloud”
    • Fill in the required information from the B2C Tenant you created in the Azure portal previously
    • Copy the “Reply URI” from the “Change Authentication” dialog and enter it into the application properties for the app you previously created in your B2C tenant in the Azure portal.
    • Click OK

Step 3: Try it out

Now run your application (ctrl+F5), and click “Sign in” in the top right:


You’ll be navigated to Azure’s B2C sign-in/sign-up page:


The first time, click the “Sign up now” link at the bottom to create your account. Once your account is created, you’ll be redirected back to your app and you’re now signed in. It’s as easy as that.


Additional Resources

The above walkthrough shows a quick overview of how to get started with Azure B2C and ASP.NET Core. If you are interested in exploring further or using Azure B2C in a different context, here are a few resources that you may find useful:

  • Create an ASP.NET (.NET Framework) app with B2C
  • ASP.NET Core GitHub sample: This sample demonstrates how to use a web front end to authenticate, and then obtain a token to authenticate against a backend Web API.
  • If you are looking to add support to an existing app, you may find it easiest to create a new project in Visual Studio and copy and paste the relevant code into your existing application. You can of course use code from the GitHub samples mentioned above as well

Conclusion

Hopefully you found this short overview of Azure B2C interesting. Authentication is often much more complex than the simple scenario we covered here, and there is no single “one size fits all”, so it should be pointed out that there are many alternative options, including third-party and open source options. As always, feel free to let me know what you think in the comments section below, or via twitter.

Configure Azure Storage for SQL Database backups


SQL Server 2012+ and Azure SQL Managed Instance support native BACKUP commands that can back up a database to an Azure Blob Storage URL. Setting up the Azure Storage account might not be as easy as you think, because there are some constraints.

You can learn about backup to Azure storage here, but there are some hidden constraints that might hit you. In this post, I will show you how to properly set up Azure Blob Storage before you start backups.

Use Standard Classic storage account

BACKUP TO URL can back up a database to a classic Standard storage account. If you choose the Resource Manager model or the Premium storage type, you will probably get something like the following error when you start a backup:

Msg 3201, Level 16, State 1, Line 7
Cannot open backup device 'https://managedinstance.blob.core.windows.net/backups/tpcc.bak'. Operating system error 50(The request is not supported.).
Msg 3013, Level 16, State 1, Line 7
BACKUP DATABASE is terminating abnormally.

Also, Connect to a Microsoft Azure Subscription will not work with Blob Storage if the account is in the "Resource Manager Deployment" model. Therefore, make sure that you choose the following settings when you create the Azure Storage account:

Credentials

Once you set up the storage account and create the container where you will place your backups, you need to create a CREDENTIAL in the master database that will be used to back up the database. The credential MUST have the same name as the Azure Storage container URL:

CREATE CREDENTIAL [https://managedinstance.blob.core.windows.net/backups]
 WITH IDENTITY='SHARED ACCESS SIGNATURE'
 , SECRET = N'sv=2017-04-17&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-10-18T02:51:12Z&st=2017-10-15T18:51:12Z&spr=https&sig=7lxfhiQNAt%2F%2Bo%3D'

 

One thing that will break this is the value that you put in the SECRET. The SECRET is the SAS key of the storage account, which you can find in the portal. You can generate a new SAS key by going to the Shared Access Signature blade and generating a SAS:

Note another interesting thing - the UTC time zone in the drop-down. The Azure portal will place your current local time in Start Time, but it will leave the time zone set to UTC instead of your time zone. If you are not in the UTC time zone and you don't notice that this start time is actually in the future, you might waste a few hours debugging why you cannot access storage with a properly generated token.

Finally, when you press Generate SAS, you will get a valid SAS key that you can copy into your CREATE CREDENTIAL command. However, the value shown in the portal starts with "?". If you just copy and paste this string, your backup will break, because the SQL Database Engine expects a value without the leading "?". Make sure that you remove the "?" from the SAS key when you copy this value.

Another option is to use SSMS to access Blob Storage and let SSMS create the CREDENTIAL.
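With the storage account, container, and credential in place, the backup itself is issued against the same container URL. A minimal example (the database and file names are illustrative):

BACKUP DATABASE [tpcc]
TO URL = 'https://managedinstance.blob.core.windows.net/backups/tpcc.bak'
WITH COMPRESSION, STATS = 10;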

Conclusion

These are three things that can cost you a few hours of debugging if you are not creating Azure Storage keys every day. Make sure that you check the type of storage, the validity period, and the leading "?" in the SAS key when you configure Azure Storage, or your backups will fail.

Jump Start ASP.NET Core Application on Nano Server


This post is provided by Senior App Dev Manager, Linkai Yu who shares some tips to speed up ASP.NET Core Application dev on Nano Server.


With the Nano Server release in Windows Server 2016, I’m excited to see the new application deployment practices that work with this thin and fast OS.

If you are new to ASP.NET Core application deployment on Nano server, I hope this blog will help you get up and running quickly while avoiding wasting time on the old documentation.

If you search “asp.net core app on nano server”, you will find some older documentation such as:

ASP.NET Core on Nano Server | Microsoft Docs

Running Asp.Net Core with IIS on Nano Server | Microsoft Docs

ASP.NET Core on Nano Server — ASP.NET documentation

These documents are all based on the traditional architecture where you create a Nano OS image, configure it for ASP.NET Core and IIS, then boot up the OS and run the app. The steps are long and error-prone.

There is a new approach and a better way to do this.  If you search “visual studio 2017 docker tutorial”, you will find articles such as:

.NET Docker Development with Visual Studio 2017

Using .NET and Docker Together | .NET Blog

With the new Visual Studio support for containers, things become much easier. With Visual Studio 2017, if you install Docker for Windows on your development machine (currently only Windows 10 and Windows Server 2016 are supported), you can build an ASP.NET Core app and enable Docker support in the project. When you build and run it in Docker, Docker will pull down the Nano Server image that has ASP.NET Core on it to your local machine and run your app. Basically, you don’t have to deal with the OS anymore. If you have a regular ASP.NET application, you can do the same thing. The difference is that Visual Studio will configure the dockerfile to pull a Windows Server Core OS image that has ASP.NET and the .NET Framework installed.

To provide you with a simple, intuitive guideline, here are the steps:

  1. Install Docker for Windows. In the Docker settings, select Windows Containers if you want to run Nano server.
  2. In Visual Studio 2017, create an ASP.NET Core web project (either select Docker support when creating the project, or add Docker support after the project is created)
  3. Build and Run Docker (Docker replaces the Debug button in Visual Studio)
  4. You can also build the Release version and then run it outside Visual Studio, in a regular CMD console (e.g. C:\> docker run myNetCoreApp)

Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.


Announcing the .NET Framework 4.7.1


Today, we are announcing the release of the .NET Framework 4.7.1. It’s included in the Windows 10 Fall Creators Update. .NET Framework 4.7.1 is also available on Windows 7+ and Windows Server 2008 R2+.  We’ve added support for targeting the .NET Framework 4.7.1 in Visual Studio 2017 15.5.

The .NET Framework 4.7.1 includes improvements in several areas:

  • Accessibility improvements in narration, high contrast and focus control areas
  • .NET Framework support for .NET Standard 2.0 and compiler features
  • More secure SHA-2 support in ASP.NET and System.Messaging
  • Configuration builders
  • ASP.NET Execution step feature
  • ASP.NET HttpCookie parsing
  • Enhancements in Visual Tree for WPF applications
  • Performance and reliability improvements

You can download the .NET Framework 4.7.1 now.

For building applications targeting the .NET Framework 4.7.1, download the Developer Pack. You can see the complete list of improvements in the .NET Framework 4.7.1 release notes. .NET Framework 4.7.1 reference sources are available on the GitHub .NET Reference Source read-only repository. .NET Framework 4.7.1 will be available on Windows Update in the near future. Docker images will be made available for this release and we will update this post when available.

Supported Windows Versions

The .NET Framework 4.7.1 is supported on the following Windows versions:

  • Windows 10 Fall Creators Update (included in-box)
  • Windows 10 Creators Update
  • Windows 10 Anniversary Update
  • Windows 8.1
  • Windows 7 SP1

The .NET Framework 4.7.1 is supported on the following Windows Server versions:

  • Windows Server 2016
  • Windows Server 2012 R2
  • Windows Server 2012
  • Windows Server 2008 R2 SP1

BCL - .NET Standard 2.0 Support

.NET Framework 4.7.1 has built-in support for .NET Standard 2.0. .NET Framework 4.7.1 adds about 200 missing APIs that were part of .NET Standard 2.0 but not actually implemented by .NET Framework 4.6.1, 4.6.2 or 4.7. You can refer to details on .NET Standard on .NET Standard Microsoft docs.

Applications that target .NET Framework 4.6.1 through 4.7 must deploy additional .NET Standard 2.0 support files in order to consume .NET Standard 2.0 libraries. This situation occurred because the .NET Standard 2.0 spec was finalized after .NET Framework 4.6.1 was released. .NET Framework 4.7.1 is the first .NET Framework release after .NET Standard 2.0, enabling us to provide comprehensive .NET Standard 2.0 support. 

Experience in .NET Framework 4.6.1 through 4.7
  • If you use Visual Studio 2017 15.3 or higher, the .NET Standard 2.0 support files are automatically copied to the application's output folder.
  • If you use Visual Studio 2015 and use NuGet 3.6, we will prompt you to install a support package which will handle copying the support files to the output directory.
Experience in .NET Framework 4.7.1
  • These support files no longer have to be deployed with the application - they are built right into the .NET Framework itself.
  • This also removes the need for binding redirects when using .NET Standard libraries on .NET Framework because the CLR automatically unifies version numbers of assemblies that are part of the platform.

Runtime - GC Performance Improvements

.NET Framework 4.7.1 brings in changes to Garbage Collection (GC) to improve allocation performance, especially for Large Object Heap (LOH) allocations. This is due to an architectural change to split the heap’s allocation lock into two, one for the Small Object Heap (SOH) and one for the LOH. Applications that make a lot of LOH allocations should see a reduction in allocation lock contention and better performance. These improvements allow LOH allocations while a Background GC (BGC) is sweeping the SOH. Usually the LOH allocator waits for the whole duration of the BGC sweep process before it can satisfy requests to allocate memory. This can hinder performance. You can observe this problem in PerfView's GCStats where there is an 'LOH allocation pause (due to background GC) > 200 msec Events' table. The pause reason is 'Waiting for BGC to thread free lists'. This feature should help mitigate this problem.

ASP.NET Forms Authentication Credentials

ASP.NET has always allowed developers to store user credentials with hashed passwords in configuration files. Previously, the available hash algorithms for this feature were MD5 or SHA-1. Now new secure SHA-2 hash options like SHA-256, SHA-384 and SHA-512 are added in .NET Framework 4.7.1. SHA-1 is still the default to preserve compatibility.

Refer to the following sample to leverage this new feature.

SHA-2 support for Message.HashAlgorithm

In previous versions, if application code specified a System.Messaging HashAlgorithm value, it was limited to MD5 and SHA-1. With .NET Framework 4.7.1, support for the HashAlgorithm values SHA-256, SHA-384, and SHA-512 has been added to System.Messaging's Message.HashAlgorithm. The actual usage of these values is in MSMQ, as MSMQ makes the "default" decision and these values are simply passed down to MSMQ; System.Messaging does not do any hashing with these values. The following snippet illustrates how you can enable hashing on a queue and create a message with these new values.
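A minimal sketch (the queue path and message body are illustrative, and the queue is assumed to already exist):

using System.Messaging;

var queue = new MessageQueue(@".\private$\ordersQueue");
var message = new Message("order payload")
{
    UseAuthentication = true,               // ask MSMQ to sign/hash the message
    HashAlgorithm = HashAlgorithm.Sha512    // one of the SHA-2 values added in .NET Framework 4.7.1
};
queue.Send(message);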

Configuration builders

Configuration builders allow developers to inject and build configuration for applications at runtime, allowing configuration data to be pulled from sources beyond the traditional .config file.  In previous versions of the .NET Framework, configuration has been static. Applications only draw configuration data from a limited chain of .config files. With Configuration Builders, applications can apply a custom-defined set of builders to any section of config. These builders are free to modify the configuration data contained in the given config section, or build it entirely from scratch - possibly drawing new data from new sources that are not static files.
To use the Configuration Builders feature, developers simply need to declare builders in config, then apply them to configuration sections with the configBuilders attribute.
To implement a custom Configuration Builder, developers can inherit from the System.Configuration.ConfigurationBuilder base class.

Here are some code samples that will enable you to declare, use and apply configuration builders.
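As a rough sketch, a custom builder that overrides appSettings values from same-named environment variables might look like the following (the class name and override policy are illustrative, not part of the framework). It would then be declared under <configBuilders> and applied to <appSettings> via its configBuilders attribute.

using System;
using System.Configuration;

public class EnvironmentOverrideBuilder : ConfigurationBuilder
{
    // Called for each section the builder is applied to; this sketch only touches appSettings.
    public override ConfigurationSection ProcessConfigurationSection(ConfigurationSection section)
    {
        if (section is AppSettingsSection appSettings)
        {
            foreach (string key in appSettings.Settings.AllKeys)
            {
                string overrideValue = Environment.GetEnvironmentVariable(key);
                if (overrideValue != null)
                {
                    appSettings.Settings[key].Value = overrideValue;
                }
            }
        }
        return section;
    }
}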

ASP.NET Execution Step Feature

ASP.NET processes requests in its predefined pipeline which includes 23 events. ASP.NET executes each event handler as an execution step. With this new ExecutionStepInvoker feature, developers will be able to run this execution step inside their code.
Today ASP.NET can’t flow the execution context due to switching between native threads and managed threads. ASP.NET selectively flows only the HttpContext which may not be sufficient for ambient context scenarios. With this feature we enable modules to restore ambient data. The ExecutionStepInvoker is intended for libraries that care about the execution flow of the application (tracing, profiling, diagnostics, transactions, etc.).

We have added a new API to enable this: HttpApplication.OnExecuteRequestStep(Action<HttpContextBase, Action> callback)

Check the following sample to leverage this new feature.
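A rough sketch of wiring this up from an HttpModule (the module name and the ambient-restore helper are hypothetical; only OnExecuteRequestStep comes from the new API):

using System;
using System.Web;

public class AmbientContextModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        // Wrap every pipeline execution step so ambient data can be restored around it.
        application.OnExecuteRequestStep((context, step) =>
        {
            RestoreAmbientData(context);   // hypothetical helper: re-establish tracing/transaction context
            step();                        // run the original execution step
        });
    }

    public void Dispose() { }

    private static void RestoreAmbientData(HttpContextBase context)
    {
        // Application-specific: pull ambient state from context.Items, async-locals, etc.
    }
}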

ASP.NET HttpCookie parsing

It can be challenging to parse Set-Cookie/Cookie headers in order to read and write cookie properties from HTTP headers. Now, we have provided support for a new API that allows for a standardized way to create an HttpCookie object from a string and accurately capture properties of the cookie like the expiration date, path, and secure indicator. Furthermore, it assigns cookie value(s) appropriately. This new ASP.NET API for parsing an HttpCookie from Set-Cookie/Cookie headers reads as follows: static bool HttpCookie.TryParse(string s, out HttpCookie result)
The following sample illustrates the usage of this new API.
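A minimal sketch (the header value is illustrative):

using System.Web;

string setCookieValue = "session=abc123; path=/; secure; httponly";
if (HttpCookie.TryParse(setCookieValue, out HttpCookie cookie))
{
    // cookie.Name, cookie.Value, cookie.Path, cookie.Secure, and cookie.HttpOnly
    // are populated from the parsed string.
}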

Compiler - ValueTuple is Serializable

The System.ValueTuple types in .NET Framework 4.7.1 are now marked as Serializable, which allows binary serialization as shown in the example below. Since the C# 7.0 and VB 15.5 tuple syntax, for example (int, string), relies on System.ValueTuple, this should make migrating from System.Tuple to the new tuple syntax easier.
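A minimal sketch of such a round-trip (the tuple contents are illustrative):

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class ValueTupleSerializationExample
{
    static void Main()
    {
        (int Id, string Title) book = (42, "Sample");
        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, book);      // works because ValueTuple is now [Serializable]
            stream.Position = 0;
            var copy = ((int, string))formatter.Deserialize(stream);
            Console.WriteLine(copy);                // (42, Sample)
        }
    }
}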

Compiler - Support for ReadOnlyReferences

.NET Framework 4.7.1 adds initial support for the ReadOnlyReferences C# 7.2 language feature, which is coming in a future Visual Studio 2017 update. .NET Framework 4.7.1 introduces the IsReadOnlyAttribute for the ReadOnlyReferences feature. This attribute will be used by the compiler to mark members that have readonly-ref return types or parameters. If the compiler is running against an older .NET Framework version, it will generate this attribute and embed its definition in the compiled assembly. The following example illustrates C# 7.2 code that can make use of this attribute.
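A short sketch of such code (the types and members are illustrative); members with 'in' parameters or readonly-ref returns are the ones the compiler annotates with IsReadOnlyAttribute:

public struct Vector3
{
    public double X, Y, Z;
}

public static class Geometry
{
    private static readonly Vector3 _origin;

    // 'in' parameters are passed by readonly reference instead of by copy.
    public static double Dot(in Vector3 a, in Vector3 b)
        => a.X * b.X + a.Y * b.Y + a.Z * b.Z;

    // A readonly-ref return exposes _origin without copying and without allowing mutation.
    public static ref readonly Vector3 Origin => ref _origin;
}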

Compiler - Support for Runtime Feature Detection

This new API provides a way to detect whether a particular runtime supports a certain feature or not. At compile time, the API provides a way to do that statically through reflection. Whenever the compiler needs to check for runtime support, it looks for the corresponding well-known enum member, for instance, System.Runtime.CompilerServices.RuntimeCapabilities.SupportsDefaultImplementation. If the member exists, the feature is supported; the value of that enum member is ignored.

At runtime the check for feature support is done by calling a static method. This is enabled by the addition of the framework type RuntimeFeature. Tools can query it by calling the static method bool IsSupported(string) to check whether the feature is supported or not, by passing in the string name for a given feature. For example, RuntimeFeature.IsSupported("FixedGenericAttributes").

The following example illustrates code that can make use of this API.
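For instance (a sketch; the capability string is the one used in the example above):

if (RuntimeFeature.IsSupported("FixedGenericAttributes"))
{
    // Take the code path that relies on the newer runtime capability.
}
else
{
    // Fall back to behavior that works on older runtimes.
}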

Runtime - Support for Portable PDBs

This feature adds support for Portable PDBs in the .NET Framework. Libraries that generate code at runtime, like C# Scripting, would benefit from being able to detect whether the runtime supports Portable PDBs or not, because they could then emit Portable PDBs instead of Windows PDBs. Emitting Portable PDBs has performance benefits; it is faster and has a much smaller memory footprint. In the absence of this new API, the library would need to resort to hard-coding build numbers of mscorlib or conservatively assume that the .NET Framework doesn't support Portable PDBs. In addition, the RuntimeFeature.IsSupported method returns true if 'PortablePdb' is passed to it. The following sample illustrates how this can be checked.
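For example, a code-generating library could decide which PDB format to emit like this (a sketch based on the check described above):

bool usePortablePdb = RuntimeFeature.IsSupported("PortablePdb");
// Emit Portable PDBs when supported; otherwise fall back to Windows PDBs.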

Accessibility improvements

.NET Framework 4.7.1 brings in a lot of accessibility improvements across different libraries to align with the broad Microsoft product accessibility goals.

Enabling the Accessibility Improvements

In order for the application to benefit from these changes, it needs to run on the .NET Framework 4.7.1 or later and be configured in one of the following ways:

  • It is recompiled to target the .NET Framework 4.7.1. OR
  • It opts out of the legacy accessibility behaviors by adding the following AppContext Switch to the <runtime> section of the app config file and setting it to false, as the following example shows.
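A sketch of that configuration (the switch name is taken from the .NET Framework 4.7.1 accessibility documentation; verify it against the Application Compatibility documentation for your scenario):

<runtime>
  <!-- Setting the legacy accessibility switch to false opts in to the new accessibility behaviors. -->
  <AppContextSwitchOverrides value="Switch.UseLegacyAccessibilityFeatures=false" />
</runtime>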

Applications that target the .NET Framework 4.7.1 or later and want to preserve the legacy accessibility behavior can opt in to the use of legacy accessibility features by explicitly setting this AppContext switch to "true". Detailed information on all the Accessibility changes are provided in the .NET Framework 4.7.1 Application Compatibility documentation.

Windows Forms Accessibility improvements

Windows Forms accessibility changes are in the following areas:

  • Improved display during High Contrast mode
  • Enhanced UI accessibility patterns
  • Improved UI Accessibility properties with the outcome of improved experiences in accessibility tools like Narrator

High Contrast Improvements

Various controls in WinForms are now improved in the way they render under the various High Contrast modes available in the operating system (OS). Windows 10 has changed the values for some high contrast system colors, and Windows Forms is based on the Windows 10 Win32 framework. For the best experience, run on the latest version of Windows and opt in to the latest OS changes by adding an app.manifest file to a test application and un-commenting the Windows 10 supportedOS line so that it looks like the following example:

    <!-- Windows 10 -->
    <supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}" />

Some examples of High Contrast changes are as follows:

  • Check marks in MenuStrip items are now easier to view
  • Disabled MenuStrip Items when selected are now easier to view
  • Text in a selected button is now contrasting enough with the selection color
  • Disabled text is now easier to read


Improved Narrator Support

You can observe the following accessibility improvements in the Narrator area after you opt-in to the Accessibility improvements in .NET Framework 4.7.1.

  • The MonthCalendar control’s value is now read by the Narrator.
  • The CheckedListBox now notifies Narrator when the CheckedState property has changed so the user is notified that they’ve changed the value of a list item.
  • DataGridViewCell now reports the correct read-only status to Narrator.
  • Narrator can now read Disabled ToolStripMenuItem text when previously it would skip over disabled menu items.

UI Accessibility Patterns

Developers of accessibility technology tools will now be able to leverage common UI Accessibility patterns and properties for several WinForms controls. These improvements include:

WPF Accessibility improvements

Accessibility improvements in WPF are in the following areas:

  • UIAutomation LiveRegion support
  • Screen Readers
  • High Contrast

UIAutomation LiveRegion Support

Screen readers such as Narrator help people read the UI contents of an application, usually by text-to-speech output of the UI content that’s currently focused. However, if a UI element changes somewhere in the screen and it is not being focused at that point in time, the user may not be notified, and so they may be missing important information.

LiveRegions are meant to solve this problem. A developer can use them to inform the screen reader, or any other UIAutomation client, that an important change has been made to a UI element. The screen reader can then make decisions of its own as to how and when to inform the user of this change. The LiveSetting property also informs the screen reader of the importance of the UI change to the user.

LiveSettingProperty and LiveRegionChangedEvent have been added to System.Windows.Automation.AutomationElementIdentifiers, settable via XAML.

A new DependencyProperty is now registered for “LiveSetting” under System.Windows.Automation.AutomationProperties, as well as Set and Get methods. System.Windows.Automation.Peers.AutomationPeer now has a new method GetLiveSettingCore, which can be overridden to provide a LiveSetting value.

A new enumeration for the possible values of LiveSetting has been added to System.Windows.Automation.

How to make a LiveRegion?

You can set the AutomationProperties.LiveSetting property on the element of interest to make it a “LiveRegion” as shown in the following sample.
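For example, from code-behind (the element name is illustrative; the same property can also be set in XAML):

// statusText is, for example, a TextBlock whose content is updated at runtime.
AutomationProperties.SetLiveSetting(statusText, AutomationLiveSetting.Polite);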

Announcing an important UI change

When the data changes on your LiveRegion, and you feel the need to inform a screen reader about that change, you need to explicitly raise an event as illustrated by the following sample.
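A sketch of raising that notification after updating the element (the element name is illustrative; if the element does not yet have an automation peer, one can be created with FrameworkElementAutomationPeer.CreatePeerForElement):

statusText.Text = "3 new messages";
var peer = FrameworkElementAutomationPeer.FromElement(statusText);
peer?.RaiseAutomationEvent(AutomationEvents.LiveRegionChanged);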

Screen reader

You can observe the following accessibility improvements in the screen reader area after you opt-in to the Accessibility improvements in .NET Framework 4.7.1.

  • In previous versions, Expanders were announced by screen readers as buttons; they are now correctly announced as groups (expand/collapse).
  • In previous releases, DataGridCells were announced by screen readers as “custom”; they are now correctly announced as data grid cells (localized).
  • Screen readers now announce the name of an editable ComboBox.
  • In previous releases, PasswordBoxes were announced as “no item in view” or had otherwise incorrect behavior; this issue is now fixed.

High Contrast

There are High Contrast improvements in various WPF controls and they are visible when High Contrast theme is set.

Expander control

The focus visual for the expander control is now visible. The keyboard visuals for combo-box, list-box and radio buttons are visible as well.


CheckBox and RadioButton

The text in CheckBox and RadioButton is now easier to see when selected in high contrast themes.


ComboBox

The border of a disabled ComboBox is now the same color as disabled text.


Disabled and focused buttons use the correct theme color.


Setting a ComboBox’s style to ToolBar.ComboBoxStyleKey caused the dropdown arrow to be invisible; this issue is now fixed.


DataGrid

The sort indicator arrow in DataGrid now uses correct theme colors.


Previously, the default link style changed to an incorrect color on mouse-over in high contrast modes; this is now resolved. Similarly, the DataGrid checkbox column now uses the expected colors for keyboard focus feedback.


WCF SDK Tools Accessibility Improvements

.NET Framework 4.7.1 SDK tools - SvcConfigEditor.exe and SvcTraceViewer.exe have improved accessibility in the following areas:

  • Screen Readers
  • High Contrast
  • Keyboard focus order and keyboard navigation

One of the key improvements in SvcConfigEditor.exe is the new Diagnostics screen. In previous versions of SvcConfigEditor, the Performance Counter toggle link had no way of displaying which options were available. It was unclear how to enable and/or disable features, and keyboard navigation was limited and unpredictable. Since most of the GUI design was based on labels without action controls, screen readers failed to read and highlight the items correctly, and labels with colors not compatible with different high-contrast settings were abundantly used.


Screen Readers

You can observe the following accessibility improvements in the screen reader area in .NET Framework 4.7.1 SDK.

  • In previous versions of SvcConfigEditor.exe, screen readers did not read the ‘Configuration’ and ‘Services’ items when you tabbed to them; they are now correctly announced as ‘Configuration’ and ‘Services’.
  • In previous versions of SvcConfigEditor.exe, screen readers did not read the ‘Address: XXX’ / ‘Binding: XXX’ / ‘Contract: XXX’ items when you tabbed to them; they are now correctly announced.

High Contrast

The WCF SDK tools have improved various controls so that they are now more visible when a High Contrast theme is set. You can refer to the following example of a high contrast improvement in SvcConfigEditor.exe. There are many other similar improvements.


Keyboard Focus Order and Keyboard Navigation

In .NET Framework 4.7.1 WCF SDK tools have improved UI keyboard focus order to make it more logical for keyboard access, and improved some controls to be keyboard accessible. You can refer to the following examples:

  • In previous versions of SvcConfigEditor.exe, the focus order in the "Edit WCF Configuration" window was inappropriate; the controls now follow a more logical focus order.
  • In previous versions of SvcTraceViewer.exe, the toolbar menu items were not accessible when you tried to navigate to them using the keyboard; they are accessible now.
  • In previous versions of SvcTraceViewer.exe, the Options items on the Graph -> Formatted tab were not keyboard accessible. They can now be accessed via the keyboard by using the arrow down key.

WPF - Changing implicit data templates

This feature enables the automatic update of elements that use implicit DataTemplates after changing a resource. When an application adds, removes, or replaces a value declared in a ResourceDictionary, WPF automatically updates all elements that use the value in most cases, including the implicit style case: <Style TargetType="Button">. Here the value should apply to all buttons in the scope of the resource. This feature supports a similar update in the implicit data template case, where the value should apply to all in-scope ContentPresenters whose content is a Book: <DataTemplate DataType="{x:Type local:Book}">

This feature's principal client is Visual Studio's "Edit-and-Continue" facility, when a user changes a DataTemplate resource in a running application and expects to see the effect of that change when the application continues. However it could also prove useful to any application with changing DataTemplate resources.

The feature is controlled by a new property, ResourceDictionary.InvalidatesImplicitDataTemplateResources. After setting this to True, any changes to DataTemplate resources in the dictionary will cause all ContentPresenters in the scope of the dictionary to re-evaluate their choice of DataTemplate. This is a moderately expensive process - our recommendation is not to enable it unless you really need it.

WPF - Distinguishing dynamic values in a template

This feature enables a caller to determine whether a value obtained from a template is "dynamic". Diagnostic assistants, such as Visual Studio's "Edit-and-Continue" facility, need to know whether a templated value is dynamic, in order to propagate a user's changes correctly.

The feature is implemented by a new method on the class DependencyPropertyHelper:

This returns true if the template's value for the given property is "dynamic", that is, if it is declared via a DynamicResource reference or TemplateBinding, or via Binding or one of its derived classes.

WPF - SourceInfo for elements in templates

Diagnostic assistants such as Visual Studio's "Edit-and-Continue" facility can use SourceInfo to locate the file and line number where a given element was declared.  The SourceInfo is now available for elements declared in a template loaded from XAML (as opposed to compiled BAML). This enables diagnostic assistants to do a better job. This feature is enabled automatically whenever SourceInfo itself is enabled.

WPF - Enable Visual Diagnostics

This feature provides a number of ways to control the VisualDiagnostics features. Diagnostic assistants can request WPF to share internal information. This feature gives both the assistant and the application developer more control over when this sharing is enabled.
The VisualDiagnostic features in WPF, with their introduction in .NET Framework 4.6, were initially only enabled when a managed debugger was attached. However, scenarios have arisen involving other components (besides a debugger) that can reasonably be considered as a diagnostic assistant, e.g. Visual Studio's design surface. Thus, the need for a public way to control the features.  The feature is controlled by two new methods on the class VisualDiagnostics, and by a number of registry keys, app-context switches, and environment variables.

The methods enable and disable the VisualTreeChanged event. You can only enable this event in a "diagnostic scenario", defined as one of the following:

  • A debugger is attached
  • Windows 10 Developer Mode is set. More precisely, the registry value HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\AppModelUnlock\AllowDevelopmentWithoutDevLicense is 1
  • Environment variable ENABLE_XAML_DIAGNOSTICS_VISUAL_TREE_NOTIFICATIONS is set to a value different from "0" or "false" (case-insensitive).

Changes to the visual tree are disallowed while a VisualTreeChanged event is in progress. Specifically, an InvalidOperationException is thrown by any of the following actions:

  • Changing a visual or logical parent
  • Changing a resource dictionary
  • Changing a DependencyProperty value on a FrameworkElement or FrameworkContentElement.

This guards against unexpected and unsupported re-entrancy.

It is possible to override this InvalidOperationException, should you encounter a situation where debugging is impeded by it. To do so, add the following AppContext Switch to the <runtime> section of the app config file and set it to true,

Switch.System.Windows.Diagnostics.AllowChangesDuringVisualTreeChanged

None of the features mentioned here are supported in production applications. They are intended only for diagnostic assistance.

Finally, you may want to run your application under the debugger, but in "production mode" without any potential interference from the VisualDiagnostic features. To do so, add the following AppContext Switch to the <runtime> section of the app config file and set it to true,

Switch.System.Windows.Diagnostics.DisableDiagnostics

Closing

Please try out these improvements in the .NET Framework and let us know what you think. Please share your feedback in the comments below or on GitHub.

How to get system diagnostics working on environments deployed in customer/partner subscriptions


With a purchased Azure subscription, customers and partners can deploy a Microsoft Dynamics 365 for Finance and Operations, Enterprise edition environment by using the Cloud Hosted Environments feature in LCS. Due to an issue in the environment deployment flow, system diagnostics was configured incorrectly in environments that were deployed after August 30th 2017. Because of this, key features in LCS such as the Updates tile, Performance dashboards, and other features are broken. To verify whether your environment is impacted by this issue, use the following list of symptoms to compare to your environment.

Symptoms:

  • System Diagnostics and Detailed Version Information are not available from the LCS Environment page.

  • The LCS Environment page URL contains IsDiagnosticsEnabledEnvironment=false.

  • The logs in the local administrator's %AppData%\Roaming\JobLog folder contain "Invalid algorithm specified" errors.

     

To work around this issue and get the Update tiles, system diagnostics, and SQL Insights performance dashboards working again, complete the following steps.

  1. Log in to LCS and select the Shared Asset Library tile on the dashboard.
  2. Click the Model tab and download the System Diagnostics Hotfix folder.
  3. Using the local administrator account, remote into the machine that you want to fix. The account information is available on the Environment details page.
  4. Copy the hotfix archive to C:\Temp and extract it.
  5. From an elevated PowerShell command prompt, navigate to C:\Temp\'Extracted hotfix folder' and run the LcsDiagHotfix_EnvironmentRegistration.ps1 script. When prompted, enter the password of the 'axdbadmin' user, which is available on the Environment details page in LCS.

To run the script without being prompted for input or shown any output, use the following command line with the correct file paths and password for the database user:

PowerShell.exe -NonInteractive -Command "& {C:\Temp\LcsDiagHotfix_EnvReg_20170929\LcsDiagHotfix_EnvironmentRegistration.ps1 -LogFilePath C:\Temp\LcsDiagHotfix.log -DatabaseUserPassword (ConvertTo-SecureString -AsPlainText -Force -String '<Password>'); Exit $LastExitCode;}"

This will produce detailed output in C:\Temp\LcsDiagHotfix.log and return a non-zero exit code in case of any errors.

  6. After the script has completed, check the most recent LCS Diagnostics data collection logs in %AppData%\Roaming\JobLog. The latest log should show that a collection has started on this environment.
  7. Return to the environment in LCS and verify that the symptoms listed above are now fixed. The Update tiles take more time to refresh, but you should see detailed version information and the URL flag updated for this environment.
  8. If, after completing these steps, you still run into issues, log a support incident.

How to negotiate an audio format for a Windows Audio Session API (WASAPI) client


The Windows Audio Session API (WASAPI) provides a family of interfaces for playing or recording audio.

Chief among these are the IAudioClient, IAudioClient2, and IAudioClient3 interfaces.

There is a Windows audio session (WASAPI) sample on GitHub, but in this blog post I want to dive into the nitty-gritty of one particular question:

How do I decide what WAVEFORMATEX to pass to IAudioClient::Initialize*?
*Or equivalent

Before I answer this question, let's take a look at some of the relevant methods on these interfaces.

  1. IAudioClient2::SetClientProperties is a way for you to tell Windows some things about the audio stream before actually creating it (by passing an AudioClientProperties structure.)
    The client properties you specify can affect the answers to some of the questions you ask Windows, so be sure to set this BEFORE calling any of the other methods.
  2. IAudioClient::GetMixFormat gives you the audio format that the audio engine will use for this client (with its given AudioClientProperties) to mix all the similar playback streams together, or to split all the similar recording streams apart.
    This format is guaranteed to work*, but sometimes there is a better format that also works.
    * Unless you use AUDCLNT_SHAREMODE_EXCLUSIVE, or AudioClientProperties.bIsOffload = TRUE.
  3. PKEY_AudioEngine_DeviceFormat gives you the audio format that the audio engine uses after the playback mix to talk to the audio driver for the speaker, or to talk to the audio driver for the microphone before splitting the recording streams apart.
    This format is guaranteed to work with AUDCLNT_SHAREMODE_EXCLUSIVE.
    If the audio device has not been used with AUDCLNT_SHAREMODE_SHARED yet, the format may not have been calculated, and the property will be empty.
    You can force the format to be calculated by calling IAudioClient::GetMixFormat.
  4. IAudioClient::IsFormatSupported lets you ask Windows whether the client (with its given AudioClientProperties) supports a given format in a given share mode.
    In certain cases (e.g., AUDCLNT_SHAREMODE_SHARED), if the client does not support the format in question, Windows may suggest a format which (Windows thinks) is close.
  5. IAudioClient::Initialize considers the previously given AudioClientProperties; takes the WAVEFORMATEX you have decided on; and takes a set of flags, including AUDCLNT_STREAMFLAGS_XXX flags.
    The two interesting flags for format negotiation are AUDCLNT_STREAMFLAGS_AUTOCONVERTPCM and AUDCLNT_STREAMFLAGS_SRC_DEFAULT_QUALITY, which tell Windows that you want the WASAPI audio engine to do any necessary conversions between the client format you are giving it and the playback mix format or recording split format.
    This will work for uncompressed integer PCM and uncompressed floating-point client formats, but will not work for compressed formats like AAC.

OK, with all that background out of the way, let's try to answer the question. There are several approaches which will work.

  1. Use a higher level audio API instead of WASAPI
    This is the preferred approach. WASAPI is complicated. And even with all that complication, it's a very underpowered API - for example, it doesn't even do MP3 decoding.
    No matter what your application is, there is almost always a higher-level audio API which is better suited for you. If you are not sure what it is, send me an email and ask me; I might be able to recommend one for you, or I may be able to put you in touch with someone else who can.
    A few examples of higher-level audio APIs: MediaElement, MediaCapture, AudioGraph, XAudio2.
    If you've tried a higher-level API, but you've run into some problem or other and now you're resorting to WASAPI, email me and tell me about the problem; I want to fix it so we can get you back on the right API for you.
  2. If you don't care what format is used
    IAudioClient2::SetClientProperties(...);
    use IAudioClient::GetMixFormat // no need to call IsFormatSupported here
  3. If you have a format in hand
    Maybe you're playing audio from a file, or maybe you need to record from the microphone and hand off to a DSP library that insists on a particular input format.
    If that is the case, use the following pattern:
    IAudioClient2::SetClientProperties(...);
    if (IAudioClient::IsFormatSupported(formatInHand)) { use that }
    else { use IAudioClient::GetMixFormat, or the suggested closest supported format, and convert between that and formatInHand in the app code }

    Another option which will work, but which is less preferred, is to use this pattern:

    IAudioClient2::SetClientProperties(...);
    IAudioClient::Initialize(formatInHand, AUDCLNT_STREAMFLAGS_AUTOCONVERTPCM | AUDCLNT_STREAMFLAGS_SRC_DEFAULT_QUALITY)

    Either way, remember that the automatic conversion only covers uncompressed formats; if your format in hand is compressed (and you're using WASAPI directly anyway - see point 1 above!) you will need to compress/decompress the audio in app code.

Or in tabular form (because people like tables)

  Any format is fine
    AUDCLNT_SHAREMODE_SHARED:
      IAudioClient2::SetClientProperties(...);
      use IAudioClient::GetMixFormat()
    AUDCLNT_SHAREMODE_EXCLUSIVE:
      IAudioClient2::SetClientProperties(...);
      use PKEY_AudioEngine_DeviceFormat
  I have a particular format I want to use
    AUDCLNT_SHAREMODE_SHARED:
      IAudioClient2::SetClientProperties(...);
      if (IAudioClient::IsFormatSupported(yourFormat)) { use it }
      else { use the suggested closest-supported format and convert between it and yourFormat in app code }

      or

      IAudioClient2::SetClientProperties(...);
      IAudioClient::Initialize(yourFormat, AUDCLNT_STREAMFLAGS_AUTOCONVERTPCM | AUDCLNT_STREAMFLAGS_SRC_DEFAULT_QUALITY)
    AUDCLNT_SHAREMODE_EXCLUSIVE:
      IAudioClient2::SetClientProperties(...);
      if (IAudioClient::IsFormatSupported(yourFormat)) { use it }
      else { use PKEY_AudioEngine_DeviceFormat and convert between it and yourFormat in app code }

Regardless of which approach you use, you should always have some assurance that the format will work before calling IAudioClient::Initialize.

You could get this assurance in various ways - IAudioClient::GetMixFormat, IAudioClient::IsFormatSupported, or AUDCLNT_STREAMFLAGS_AUTOCONVERTPCM | AUDCLNT_STREAMFLAGS_SRC_DEFAULT_QUALITY.

It is an application bug to call IAudioClient::Initialize blind.
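
To make the shared-mode "format in hand" pattern concrete, here is a minimal C++ sketch (error handling trimmed; not production code). It assumes you have already activated an IAudioClient2 from an IMMDevice; the function name ChooseFormat, the AudioCategory_Media category, and the appMustConvert flag are illustrative choices of mine, not part of the Windows API.

#include <windows.h>
#include <objbase.h>
#include <audioclient.h>
#include <mmdeviceapi.h>
#include <string.h>

// Decide which WAVEFORMATEX to pass to IAudioClient::Initialize for a shared-mode stream.
// On success, *formatForInitialize is allocated with CoTaskMemAlloc (free it with CoTaskMemFree)
// and *appMustConvert reports whether the app must convert between formatInHand and that format.
HRESULT ChooseFormat(IAudioClient2 *client,
                     const WAVEFORMATEX *formatInHand,
                     WAVEFORMATEX **formatForInitialize,
                     bool *appMustConvert)
{
    *formatForInitialize = nullptr;
    *appMustConvert = false;

    // 1. Set the client properties FIRST; they can change the answers to the questions below.
    AudioClientProperties props = {};
    props.cbSize = sizeof(props);
    props.bIsOffload = FALSE;
    props.eCategory = AudioCategory_Media; // assumption: an ordinary media playback stream
    HRESULT hr = client->SetClientProperties(&props);
    if (FAILED(hr)) { return hr; }

    // 2. Ask whether the format in hand works in shared mode.
    WAVEFORMATEX *closest = nullptr;
    hr = client->IsFormatSupported(AUDCLNT_SHAREMODE_SHARED, formatInHand, &closest);
    if (hr == S_OK) {
        // Supported as-is: Initialize with a copy of formatInHand.
        size_t cb = sizeof(WAVEFORMATEX) + formatInHand->cbSize;
        *formatForInitialize = (WAVEFORMATEX *)CoTaskMemAlloc(cb);
        if (*formatForInitialize == nullptr) { return E_OUTOFMEMORY; }
        memcpy(*formatForInitialize, formatInHand, cb);
        return S_OK;
    }
    if (hr == S_FALSE && closest != nullptr) {
        // Not supported as-is, but Windows suggested a close format; the app does the conversion.
        *formatForInitialize = closest;
        *appMustConvert = true;
        return S_OK;
    }
    if (closest != nullptr) { CoTaskMemFree(closest); }

    // 3. Fall back to the mix format, which is guaranteed to work in shared mode.
    hr = client->GetMixFormat(formatForInitialize);
    if (FAILED(hr)) { return hr; }
    *appMustConvert = true;
    return S_OK;
}

The caller then passes *formatForInitialize to IAudioClient::Initialize with AUDCLNT_SHAREMODE_SHARED, and performs its own conversion whenever appMustConvert comes back true.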

A simple approach for strengthening the security of your Microsoft account

For several years now, Microsoft account has offered a feature that is very useful, although it may not be as well known as it deserves to be. That feature is aliases. In this post, I want to go over the use of aliases and show how they can be used to strengthen the security of your account.

Let's start with a hypothetical user: Alice Holden. Alice is creating a new Microsoft account of the form alice.holden@outlook.com. At this point, this is her only alias and is also her account's primary alias.

The alias feature allows Alice to assign additional aliases to her account. For example, she may want to use a shorter email address, so that she can type it more easily on devices; she can then simply create a new alias: aholden@outlook.com. But there are more uses. Alice may also find herself in situations in which she needs to provide an email address for contact, but does not want the address to reveal her identity, as her current aliases do. To deal with such scenarios, she could create a new alias incognito@outlook.com (in reality, she would have to be more inventive with the name, as this alias would not be available). Alice can now provide incognito@outlook.com as a contact without concern that it reveals her name and gender.

At this point, Alice has three aliases for her account:

  1. alice.holden@outlook.com (primary)
  2. aholden@outlook.com
  3. incognito@outlook.com

Note that for all purposes these act as account names. An external party using these aliases for communication would not be able to distinguish the primary alias from the other aliases, or figure out that they represent the same account (they may suspect that alice.holden and aholden represent the same account, but they cannot be 100% sure that the same account is behind both).

But alias management provides another extremely useful feature: it allows you to restrict the ability to login to only a few select aliases. Alice can make use of this feature to prevent her generic incognito@outlook.com alias from being used for authentication, because she only created it for communication with parties that she does not fully trust. With this change, the status of her aliases becomes:

  1. alice.holden@outlook.com (primary)
  2. aholden@outlook.com
  3. incognito@outlook.com (no auth)

Alice can go one step further though. Even though she restricted access through the incognito alias, her other aliases are easy to guess, and they can also be leaked through security breaches at parties that Alice trusted with that information. What Alice can do next is create a private alias that she uses only for authentication and never provides as a contact. She can then make it her primary alias and block all other aliases from being used for login (the primary alias cannot be blocked, which is why it has to be changed). The only naming requirement for this alias is that it should be hard to guess, although we don't necessarily have to go to the lengths we go to for a password. Let's say Alice creates the alias whinmf50@outlook.com, marks it as her primary alias, and removes the ability to sign in from all other aliases. Her aliases would then look like this:

  1. whinmf50@outlook.com (primary)
  2. alice.holden@outlook.com (no auth)
  3. aholden@outlook.com (no auth)
  4. incognito@outlook.com (no auth)

With this configuration, Alice can provide alice.holden@outlook.com as a contact to all trusted parties, can still give aholden@outlook.com to friends as a shortcut, and can share the incognito@outlook.com contact widely, without concern that any of them can be used to authenticate into her account. An attacker would not only have to discover her password (and her second factor of authentication, if she has set one), but also discover her private alias. This is not impossible, but it is an additional challenge the attacker would have to overcome.

I mentioned a second factor of authentication only briefly here; the reason I did not include it in the examples was simply to keep them simple and focused on the management of aliases. In reality, Alice would also set up a second factor of authentication on her account. Note, however, that the steps described here are additive (orthogonal) to any other security measures you have adopted for your account: you don't lose anything by adopting them, and you gain an additional layer of security by making your primary alias another secret. A corollary of this being an additive measure is that it should be added to something - you should not consider this approach a substitute for a strong password or for a second factor of authentication.

For this setup to be effective, Alice has to be careful not to leak her whinmf50 alias by, for example, sharing it on her profile page or using it to send email. Since the primary alias is sometimes displayed by services that use Microsoft account authentication, Alice should also be careful when sharing the contents of her screen. The primary alias basically needs to be treated as a secret. If there is any evidence that the alias has been discovered - for example, spam messages sent specifically to it - Alice can simply generate another secret alias and mark it as primary; she can then remove the old alias, and she doesn't need to notify anyone about the change because she never used the old alias for any communication.

At this point, even if this entire approach sounded interesting, you may still have a concern about whether there are any annoyances that come with changing your account's primary alias. I admit that I had a concern too, but I decided to go through this exercise a week ago and change the primary alias of my account to a freshly created one. I am happy to report that everything transitioned smoothly. The only action I had to take was when one app asked me to sign in again and even then I am not sure if this was due to the change or just a coincidence. It really is a smooth process. Furthermore, Outlook is still composing my email messages using my public alias, which I believe I may have selected as default alias in Outlook settings at some point in the past.

I hope this post gives you a better appreciation of the value of aliases for protecting your account. And as a last piece of advice, keep checking your account's activity history from time to time, to see if there are any unrecognized attempts to break into your account.

OneDrive not updated after upgrade to Windows 10 1709

For whatever reason, OneDrive did not update in one of my machines after I upgraded to the new Windows 10, 1709.

I was able to bump it up by downloading the setup file from here:
https://oneclient.sfx.ms/Win/Prod/17.3.7064.1005/OneDriveSetup.exe

 

I found the address in the AppData\Local\Microsoft\OneDrive\StandaloneUpdater\Update.xml file.
It was pointing to the old version, so I got the new version number from another machine and downloaded the installer directly.

I suppose https://onedrive.live.com/about/en-us/download/ will eventually point to that as well (or a similar link).

Release notes for October update for Field Service and Project Service Automation

With the goal of continuously improving quality, performance, and usability, and in response to customer feature feedback, we recently released an October update for Dynamics 365 for Field Service and Dynamics 365 for Project Service Automation. Below are the new capabilities and bug fixes introduced in this update.

Field Service (v 7.1.0.33) Enhancements  

Below are some new capabilities enabled for this upgrade release: 

  • Added a new view that shows only Field Service-based Quotes and Field Service-based Orders, and linked it to the sitemap in Field Service. 
  • Added a switch in Field Service settings to turn off address suggestions. 
  • Handled querying Service Territory offline to improve performance.  

Below are the major bug fixes for this upgrade release:  

  • Schedule assistant is not displaying resources as per Requirement Resource Preference (Restricted/Preferred) on IE11 Browser. 

Refer to Universal Resource Scheduling Enhancements section for other scheduling related enhancements and bug fixes.  

NOTE: This upgrade release can only be installed/upgraded for Dynamics 365 9.0+ org. 

 

Project Service Automation (v2.1.0.30) Enhancements  

Below are some new capabilities enabled for this upgrade release: 

  • Added Billing Type field on expense tax invoice line details. 
  • Added Role (resource category) column between Task Id and Transaction Category columns for Actual associated view. 
  • Improved performance by avoiding unnecessary WBS aggregation on update task. 
  • Localized label and better description for invalid action on MS Project label. 

Below are the major bug fixes for this upgrade release: 

  • Time entry created in the week of DST transitions to Standard time shows up on the following day.  
  • Importing Estimate lines onto Quote line from Project for a 2nd time results in an error "record is unavailable". 
  • Contract performance does not show milestone amount in the Billed amount for FP line. 
  • "Record Is Unavailable" error is shown after navigating to and deleting the cost side detail record from a quote line detail. 
  • WBS view UX issue with column heading width and Gantt scrollbar. 
  • European number formatting not respected on the quick create UI for estimated hours when creating project from template. 
  • In MS Project, after Find Resources and book a resource, the resource sheet is not refreshed. 
  • Hitting "This action is not allowed for projects linked to MS Project." error when trying to book a team member on MSP-link project, with non-contiguous booking slots. 
  • Error pop-up when deactivating Resource Request. 
  • Generic resource is not using work hour template from project. 

Refer to Universal Resource Scheduling Enhancements section for other scheduling related enhancements and bug fixes  

NOTE: This upgrade release can only be installed/upgraded for Dynamics 365 9.0+ org   

Universal Resource Scheduling Enhancements  

Below are the major bug fixes for this upgrade release: 

  • Schedule board error when time zone set to GMT-3 Brasilia.  
  • Schedule board shows no resources available until switching from Hours view to Day view. 
  • Map pins are not refreshed when moving to next page of resources while in RM. 
  • Requirement map pin loses focus when searching for availability.
  • Handle escaping requirement name on Schedule Board. 
  • Maintain Bookings not opening in the correct view. 
  • Cancel bookings route also showing in the mini map in schedule board. 
  • Hide inactive resource characteristics from resource fly out. 
  • Booking duration and percentage are not updated when a booking is cancelled after bookings were moved to a different day. 
  • Cannot sort or filter fields added to the requirement view on the schedule board from other entities. 
  • Changing territory filter on board does not take immediate effect on the requirement tabs when Apply Territory Filter is enabled. 
  • On Schedule board, inconsistency in calculating the available capacity between hourly and daily view. 
  • Duration value is not updated when the requirement detail is deleted. 
  • Incorrect duration time on view details tooltip template in RM mode. 
  • On click of "Load Default filter" not clearing all controls in Filter control. 
  • Resource driving directions print window, print icon is missing next to print label. 
  • Add Fulfilled/Remaining Duration fields to the Requirement form. 

NOTE: Enhancements and bug fixes for Universal Resource Scheduling apply for Field Service and Project Service Automation as well as other schedulable entities. 

For more information: 

 

Feifei Qiu 

Program Manager  

Dynamics 365, Field Project Service Team

Upcoming events- #MicrosoftEdu UK Roadshow

The launch of our new #MicrosoftEdu UK Roadshow series, exclusive to this year, is fast approaching. The Microsoft UK Education Roadshow will help fulfil our mission to empower the students and teachers of today to create the world of tomorrow. With over 100 events taking place across the UK in 2017 and 2018, this is the perfect opportunity for educators to see first-hand how Microsoft technologies can enhance teaching and learning.
Events are completely FREE and perfect for those at the very beginning of their digital transformation journeys. All events will involve hands-on training workshops led by our specialist Microsoft Learning Consultants and/or Microsoft Training Academies, and will focus specifically on how Office 365 and Windows 10 can help transform learning.


Check out the Sway below to find out more.

 


Roadshow Events for October

30th October 2017 - Derby College, The Roundhouse Campus, Roundhouse Road, Pride Park, Derby DE24 8JE, United Kingdom. Sign up here.


Visit the Microsoft Educator Community UK Roadshow page to find out about the events near you and sign up.

Our aim is to reach every corner of the UK, so if you are able to host a Roadshow in your locality then please contact us by email at Eduroadshow@microsoft.com. 


Quick Reference : ASP.NET and Windows Authentication

Authentication is the process by which the system validates a user's logon or sign-in information. A user's name and password are verified and, if found correct, access is granted. Windows authentication is a complex topic, and this post gives you a quick overview of Windows authentication with ASP.NET.

Windows authentication (formerly named NTLM) is a secure form of authentication used in intranet environments to authenticate Windows users against Microsoft Active Directory. Windows users (also called NT users) are accounts created by administrators, such as mydomain\bob or euro\Alice, that are used to log in to your machine. Think of Active Directory as the place where all the users, passwords, and so on are stored; it too is set up by administrators.

Say you are the developer of an application that should be used over the intranet by everyone in your company. In this case, Windows authentication can be used to authenticate your company's internal users, a.k.a. domain users, to your ASP.NET application. The flow looks like this:

(1) Domain user (e.g. mydomain\username) => (2) Active Directory (e.g. MyDomain) => (3) ASP.NET website

  1. Any domain user or local user, e.g. mydomain\rohith or machinename\Bob
  2. This step happens transparently, without you doing any configuration or writing any code as a developer
  3. Configure Windows authentication in your ASP.NET website

To configure Windows authentication, you can refer to this article on Windows authentication, or this one. For ASP.NET specifics, refer to this KB article.
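
For reference, a minimal web.config sketch for switching an ASP.NET application to Windows authentication might look like the following. It assumes the Windows Authentication feature is installed and enabled for the site in IIS; the <deny users="?" /> rule, which blocks anonymous users, is optional:

<configuration>
  <system.web>
    <!-- Authenticate requests using Windows authentication -->
    <authentication mode="Windows" />
    <authorization>
      <!-- Optional: deny anonymous (unauthenticated) users -->
      <deny users="?" />
    </authorization>
  </system.web>
</configuration>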

Windows authentication supports two authentication protocols, Kerberos and NTLM. In the configuration, however, you will see three settings: Negotiate, Kerberos, and NTLM. Negotiate is a wrapper for Kerberos and NTLM that allows the client application to fall back to NTLM if Kerberos is not supported. When you install and enable Windows authentication on IIS 7+, the default setting is Negotiate.

Windows authentication (you can choose NTLM or Kerberos in the IIS settings) can be used under the following circumstances:

  • Client computers and web servers are in the same domain.
    • There is no firewall or proxy in between. This is the best-case scenario, where your users will be able to authenticate using the same login they use to access their machines.
  • Users need to access the application over the internet.
    • You can make Windows authentication work over the internet with NTLM, but this is not recommended.
    • Kerberos requires that the client have a direct connection to Active Directory, which is generally not the case in internet scenarios.

References :

https://technet.microsoft.com/en-us/library/cc732841(v=ws.11).aspx

https://docs.microsoft.com/en-us/iis/configuration/system.webserver/security/authentication/windowsauthentication/

https://www.iis.net/configreference/system.webserver/security/authentication/windowsauthentication?showTreeNavigation=true

 

Git and Visual Studio 2017 Part 11: Configuration

In the previous article, I explained how to share a solution using Visual Studio 2017. This time we look at Git configuration. You can change many of Git's behaviors by changing its configuration.

Configuration scopes

Git configuration has three scopes. The setting from the narrowest applicable scope is the one that ultimately takes effect.

System: applies to every user on the same PC
Global: applies to all repositories of a particular user
Local: applies only to a particular repository

You manage configuration with the 'git config' command; if you do not specify a scope, the local configuration is the one that gets changed. See here for details.

Configuration: Git

First, let's look at configuration on the Git side.

1. Run 'git config -l' to display the current configuration from all scopes.

2. Run 'git config --local -l' to get only the local-scope settings.

3. Run 'git config user.name' to display the current user name.

4. Run 'git config user.name <yourname>' to set the user name. Because the scope is omitted, it is set at the local scope. To set it at the global scope, run 'git config --global user.name <yourname>'. GitHub identifies the user by this name.

Recommended configuration settings

As a slight digression, here are some of my personal favorite settings; the commands to set them all at once follow below.

core.pager: specifies how Git displays output such as the results of the log command. The default is less, but leaving it blank dumps all of the output at once.
core.autocrlf: needed when you work across environments with different line endings, such as Windows and Linux. See the official site for details.
alias.<alias>: lets you create your own shortcuts. Commands like log take many options, so I set a shortcut called "lol" to 'log --oneline --graph --all'.
help.autocorrect: automatically corrects a mistyped command. The number you pass is in tenths of a second (1 = 0.1 seconds); since I would rather not have an unintended command run immediately, I usually set it to 50.
color.status: lets you change the colors used for each part of the status output. Red in particular can be hard to read, so I often change it. For example, 'git config color.status.changed yellow' and 'git config color.status.untracked magenta' make the changed and untracked entries yellow and magenta respectively.
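
Put together, and assuming you want them at the global scope, the commands would look something like this (the "lol" alias name and the color choices are just personal preferences):

git config --global core.pager ""                            # leave blank to dump output without a pager (per the tip above)
git config --global core.autocrlf true                       # typical choice on Windows; see the official docs for details
git config --global alias.lol "log --oneline --graph --all"  # now 'git lol' shows a compact graph of all branches
git config --global help.autocorrect 50                      # wait 5 seconds before running the auto-corrected command
git config --global color.status.changed yellow
git config --global color.status.untracked magenta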


Configuration: VS

Before looking at configuration in Visual Studio 2017, let's look at the settings related to VS. Run 'git config --global -l' and look at the merge and diff tool entries. These settings are why commands such as 'git mergetool' launch VS.

Now let's look at Git configuration in VS.

1. In Team Explorer, click Settings. "Global Settings" corresponds to the global scope and "Repository Settings" to the local scope. Click "Global Settings".

2. The user name and email address are written to the Git configuration, whereas settings such as "Default repository location" are kept as Visual Studio's own settings. The "Commit changes after merge by default" setting was explained in the merge article; for the other settings, see the details link. If the merge and diff tools have been changed away from Visual Studio, for example because you reinstalled Git, you can set them again from this screen.

3. Go back and click "Repository Settings". Here you will find both settings that override the global scope, such as the user name, and purely local-scope settings. The .gitignore and .gitattributes files can be edited from the edit links. Things that are not configuration from Git's point of view, such as remotes, are also set here.

4. Being able to edit files such as .gitignore in the VS editor is convenient. I also recommend reviewing the default settings for reference.

5. The other settings pages are read-only, but many of the values shown there are useful as a reference.

Summary

Configuration is easy to overlook but important, so I make a habit of checking what the current values are. When a team works on the same project, we also align our configuration. Next time we will look at comparing files from the history and elsewhere.

Kenichiro Nakamura

“Unexpected error from external database driver (1). (Microsoft JET Database Engine)” after applying October security updates.

We have been seeing a recent influx of cases where the JET provider is no longer able to connect after the October update. This update (released October 10th, 2017) includes a security release that inadvertently affects the JET provider. The update is KB4041678, which is also included in the rollup KB4041681. These patches update the operating system, and that change has an adverse effect on the following technologies: Microsoft Windows Search component, Windows kernel-mode drivers, Microsoft Graphics component, Internet Explorer, Windows kernel, Windows wireless networking, Microsoft JET Database Engine, and the Windows SMB Server. It is important to note that the changes were not made to these technologies themselves.

 

Types of errors witnessed:

 Unexpected error from external database driver (1). (Microsoft JET Database Engine)

 [Microsoft][Driver ODBC Excel] Reserved error (-5016).

 [Microsoft][ODBC Excel Driver]General Warning Unable to open registry key 'Temporary (volatile) Jet DSN for process

 

WORKAROUNDS & SOLUTION:

 

Approach 1:

Use Microsoft.ACE.OLEDB.12.0 or Microsoft.ACE.OLEDB.16.0: (Recommended)

The following updates were not intended to cause any issue with Microsoft Jet Database Engine 4.0; at the same time, the product group was not verifying that these updates would remain compatible with the Microsoft Jet Database Engine 4.0 data provider, as it was deprecated back in 2002:

https://support.microsoft.com/en-us/help/4041678/windows-7-update-kb4041678

https://support.microsoft.com/en-us/help/4041681/windows-7-update-kb4041681

Both articles suggest the workaround below.

 

In all currently known cases, using the ACE provider in place of the JET provider works to connect to the Excel files. The following download is the most up-to-date version of the ACE provider:

Microsoft Access Database Engine 2016 Redistributable

https://www.microsoft.com/en-us/download/details.aspx?id=54920

 

When looking into this issue, the most important thing to note is that the JET provider has been deprecated since 2002; the last changes were made to it in 2000. See the following article for more details.

Data Access Technologies Road Map

https://msdn.microsoft.com/en-us/library/ms810810.aspx

Excerpt:

Microsoft Jet Database Engine 4.0: Starting with version 2.6, MDAC no longer contains Jet components.
In other words, MDAC 2.6, 2.7, 2.8, and all future MDAC/WDAC releases do not contain Microsoft Jet, the Microsoft Jet OLE DB Provider, the ODBC Desktop Database Drivers, or Jet Data Access Objects (DAO).
The Microsoft Jet Database Engine 4.0 components entered a state of functional deprecation and sustained engineering and have not received feature level enhancements since becoming a part of Microsoft Windows in Windows 2000.”

 

So, in short, the JET provider kept working for a good 15 years after deprecation, but this most recent update introduced a change that requires you to update how you connect to the Excel file. For SSIS packages, we recommend pointing to the Excel file with our Excel connector instead of using OLE DB.

You can locate the Excel Connector by opening up an SSIS package within SQL Server Data Tools. Create or go to an existing Data Flow task. You can see the Excel Source in the “Other Sources” Section:






When using JET, this is done through an OLE DB/ODBC source. You can use the same method with the ACE provider. The ACE provider will work; however, it is not officially supported for use with programs such as SSIS, Management Studio, or other applications. Although this doesn't tend to cause issues, it is important to note. That notwithstanding, the ACE driver provides the same functionality as the JET provider did. The only limitation I am aware of is the same one you encountered with JET: you can't have multiple users connecting to and modifying the Excel file.
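
For example, a typical OLE DB connection string change for an Excel source looks roughly like this (the file paths and the HDR setting are placeholders for your own values):

Old (JET):
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\MyWorkbook.xls;Extended Properties="Excel 8.0;HDR=YES"

New (ACE):
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\MyWorkbook.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES"

For legacy .xls files the ACE provider also accepts Extended Properties="Excel 8.0;HDR=YES", so in many cases only the Provider keyword needs to change.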

Microsoft Access Database Engine 2016 Redistributable

https://www.microsoft.com/en-us/download/details.aspx?id=54920

Excerpt:

The Access Database Engine 2016 Redistributable is not intended:
 4. To be used by a system service or server-side program where the code will run under a system account, or will deal with multiple users identities concurrently, or is highly reentrant and expects stateless behavior.
Examples would include a program that is run from task scheduler when no user is logged in, or a program called from server-side web application such as ASP.NET, or a distributed component running under COM+ services.

 

I understand that there are many users that don’t know how many packages might be experiencing this issue.  It is possible to look through your dtsx packages using the following method.

If you run a command prompt as an administrator, you can use the find command to look through the packages for the keyword JET.  This will state all the files it looks through, but you will see a connection manager result for the files that have this in them. If you have already updated the package from JET to ACE, then this will not show the connection manager and is not affected by this security update.

In a command prompt:

Find /I "Jet" "C:\SampleFolder\SamplePackage.dtsx"

Alternatively, you can search the whole folder at once:

Find /C /I "Jet" "C:\SampleFolder\*.dtsx"

For more information on the Find command:

Find:  https://technet.microsoft.com/en-us/library/bb490906.aspx?f=255&MSPPError=-2147217396

 

Additionally, after installing and upgrading to the ACE provider, you may run into one of the following error messages:

The 'Microsoft.ACE.OLEDB.16.0' provider is not registered on the local machine
Or
The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine

If you are in a scenario where both 32-bit and 64-bit processing on the same server needs to use the ACE data provider, the Microsoft recommendation is to have two different servers (one to process in 32-bit mode and the other in 64-bit mode).

There is, however, an option to have both versions installed on the same machine by performing a "/quiet" install of these components from the command line. To do so, download the desired AccessDatabaseEngine.exe or AccessDatabaseEngine_x64.exe to your PC, open an administrative command prompt, and run the installer with the /quiet switch, for example: C:\Files\AccessDatabaseEngine.exe /quiet. This approach is documented on the Microsoft Access Database Engine 2016 Redistributable download page under the Additional Information section.

 

Approach 2:

Uninstall the security patch (Not recommended):

This patch appears to update the Excel JET connector from v4.00.9801.0 to v4.00.9801.1.

One workaround is to uninstall the KB, which may fix the issue. In some instances, however, it may not help resolve the problem.

 

Approach 3:

Registry change (Not recommended):

Another workaround would be to update the below registry key to point to an old copy of the DLL file:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel] (the Win32 value)

To get an old copy of the DLL, uninstall the patch KB4041681 and copy the DLL "msexcl40.dll" from C:\Windows\SysWOW64\msexcl40.dll to a new location, say "C:\msexcl\msexcl40.dll".

You can now modify the Win32 value under the registry key [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel] to point to the new DLL location "C:\msexcl\msexcl40.dll" (by default it points to C:\Windows\SysWOW64\msexcl40.dll).
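
If you prefer to script this change, a single reg.exe command along these lines should work; this is a sketch that assumes (as above) the value holding the DLL path is named Win32 and the old DLL has been copied to C:\msexcl:

reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel" /v Win32 /t REG_SZ /d "C:\msexcl\msexcl40.dll" /f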

 

Other workarounds discussed online:

There is a public forum discussion where many customers found various ways to work around this issue.

ODBC Excel Driver Stopped Working with "Unexpected error from external database driver (1). (Microsoft JET Database Engine)"

https://social.msdn.microsoft.com/Forums/en-US/2feac7ff-3fbd-4d46-afdc-65341762f753/odbc-excel-driver-stopped-working-with-unexpected-error-from-external-database-driver-1?forum=sqldataaccess

 

Solution:

The recommended solution is to move to the Microsoft ACE OLE DB provider.

Apart from this, Microsoft is working on a resolution and will provide an update in an upcoming release of the security patch. This is expected to be available in another 2-3 weeks or earlier.

If you are still encountering any related issues, please reach out to the Microsoft CSS team.

 

DISCLAIMER:

THE ABOVE INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE.

 

Authors:   

Chrone Meade - Support Engineer, SQL Server BI Developer team, Microsoft

Jon Herman - Sr Support Engineer, SQL Server BI Developer team, Microsoft

Krishnakumar Rukmangathan – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

New Microsoft Edge features in the Windows 10 Fall Creators Update

A large internal transformation has moved me to a different team, so I am no longer directly involved with Edge, but I intend to keep sharing Edge information as much as I can. With that said, this post introduces the new Edge features in the Windows 10 Fall Creators Update, which has been available since October 17 (October 18 in Japan).

What is the Windows 10 Fall Creators Update?

It is one of the two feature updates delivered to Windows 10 each year.

According to the Windows Blog, these come in "September and March of each year", yet here we are in mid-October. Curious, isn't it?

The new features delivered in this Fall Creators Update are introduced on the following page, so please take a look.

This article covers the new Microsoft Edge features in the Windows 10 Fall Creators Update.

New features in Microsoft Edge (EdgeHTML 16)

The Edge that ships with the Windows 10 Fall Creators Update does not add flashy features that announce themselves at a glance, but it has been steadily polished.

That said, let's start with the new features in the UI and design areas that users actually see, and work from there.

UI design and features

Improved button animations

Inspired by the Fluent Design System, the Microsoft Edge browser frame has been refreshed with a more modern look. The use of acrylic material gives the tab bar and other controls depth and transparency, and button animations have been improved so the browser feels more responsive and more fun.

Read aloud with word and line highlighting

Select text on a website or in a PDF, right-click, and choose "Read aloud" from the context menu to have the text read out loud. The selection is read back, and the current word and line are highlighted as it goes.

(Screen capture: reading content aloud)

Full-screen mode in Microsoft Edge (F11)

Full-screen support is now officially available. Edge could already be put into full screen by pressing [Shift] + [Win] + [Enter], but that method was never officially announced and was, in a sense, a hidden command.

Starting with this update, you can view a website in full screen simply by pressing [F11] or selecting the new full-screen icon in the settings menu. Press [F11] again, or click the restore icon in the upper-right corner, to exit full-screen view.

(Screen capture: the menu in full-screen mode)

Edge can now be closed even while a JavaScript dialog box is shown

Until now, neither Edge nor other web browsers could be closed while a dialog box produced by the JavaScript alert command was showing.

Some malicious websites still abuse this behavior, keeping a dialog on screen until you perform whatever action they want.

With this update, however, you can close Edge normally with the close button even while a JavaScript dialog box is displayed.

Address bar and tabs

Refined address bar design

The address bar design has been adjusted based on user feedback. Now, even when the address bar does not have focus, the text stays under your cursor when you click and start to drag. Previously, the text would shift to make room for the "http://" prefix. This change makes it easier to edit part of a URL.

The issue where the text in the URL bar moved when the bar received focus, making it hard to edit a specific part of the URL, has also been fixed. From this build onward, the text under your cursor stays put when you click the URL.

More control over closing tabs

As mentioned above under "Edge can now be closed even while a JavaScript dialog box is shown", you can now always use the X on the tab band to close a tab in Microsoft Edge, even when a JavaScript dialog (alert, prompt, and so on) is being displayed.

While a dialog is shown, you can also access many browser features (the favorites bar, settings, and so on).

Animations when opening and closing tabs

New tabs now animate smoothly on the tab bar as they open and close.

Session restore has also improved: when clicking a link (for example, from an email) restores a multi-window Edge session, the window focused at the end of the restore is the one that contains the new link.

Smoother color transition from the splash page

The color of the Microsoft Edge splash page (shown when the browser starts) now transitions more smoothly into the Start and New Tab pages.

Improved favorites experience

A new experience when saving favorites

When saving a new favorite, you can now view your favorites as a directory tree and collapse or expand folders from the "Add to favorites" dialog box.

Edit the URL of a favorite

You can now edit the URL of any favorite from the favorites menu or the favorites bar. Use this to update the location of a site that has moved, or to create bookmarklets on the favorites bar.

Enterprise favorites management

IT administrators can now configure favorites via Group Policy and mobile device management, including the ability to lock down a set of preconfigured favorites in addition to the user's own favorites.

Pin favorite websites to the taskbar

Based on user feedback, the ability to pin websites to the taskbar and to Start has returned. The site's icon gives you quick access to your favorite sites straight from the taskbar. To use it, just choose "Pin this page to the taskbar" from the Edge settings menu.

PDF-related features

Microsoft Edge PDF improvements

Highlight colors and an "Ask Cortana" option have been added for PDFs.

Filling in PDF forms

The ability to fill in PDF-based forms, save them, and print them has been added.

PDF annotations

The Web Notes annotation feature now works on PDFs as well as web pages. You can bring up the annotation bar with the "Make a Web Note" button at the upper right of the browser frame.

Using the different modes on the annotation bar, you can ink on the PDF, highlight text, or erase annotations. You can save your work back to the PDF file for later use.

Table of contents

For PDF documents that contain a table of contents, a new button for it appears on the left side of the PDF toolbar. Clicking a heading in the side pane takes you to that part of the document.

Screen capture showing the ToC in Edge

Better viewing and navigation

Controls have been added to rotate a PDF document for easier reading, to switch between one-page and two-page layouts, and to change between continuous and page-by-page scrolling, improving the navigation experience in long documents.

EPUB reading improvements

Copy and Ask Cortana

When you select text, you can not only copy it or ask Cortana about it, but also add a note, highlight it, or underline it. Cortana can help with your research while you are reading an EPUB book.

Screen capture showing Ask Cortana in an EPUB document

Ink notes

Adding notes and writing or drawing with a pen used to be possible only on web pages, but it now works in EPUBs as well.

Screen capture showing Ink Notes in an EPUB document

Annotate books

You can now annotate EPUB books with highlights in four colors, underlines, and added comments. To get started, select some text and choose an option from the menu.

Screen capture showing an annotated EPUB book in Microsoft Edge

Roam books across devices

For books from the Windows Store, your reading progress, bookmarks, and notes are synced and preserved across your Windows 10 devices.

Note, however, that EPUB books are currently sold only in the US Windows Store.

Features for developers

Web platform

What about Progressive Web Apps (PWA) support?
According to this post on the Microsoft Edge Dev Blog, support "will be coming to preview builds of Microsoft Edge for developer testing this summer", but the pieces that make up PWAs - Service Worker, Web Application Manifest, the Push API, the Background Sync API, and so on - are still marked [IN DEVELOPMENT], so it looks like it will take a little longer.

Features for administrators

Group Policy objects

Microsoft Edge provides Group Policies that let administrators manage settings centrally in a Windows domain environment.

With this update, three new Group Policies have been added to Microsoft Edge:

  • Always show the Books Library in Microsoft Edge: determines whether the Books tab is shown regardless of the device's country or region setting configured in the "Country or region" area of Windows settings.
  • Provision Favorites: lets you configure a default set of favorites that users will see.
  • Prevent changes to Favorites on Microsoft Edge: when this setting is enabled, users cannot add to, import, or change anything in the Favorites list.

For the other, existing Group Policy settings that can be specified for Microsoft Edge, please refer to the documentation below.

Other improvements

Better rendering performance - independent rendering

EdgeHTML 16 can selectively offload graphics work to additional CPU threads, which allows pages to render with silky-smooth scrolling, responsive interaction, and fluid animations while minimizing the impact on the UI thread and on the page's overall visible performance characteristics. In EdgeHTML 16, full support for the following elements enables independent rendering on more sites:

  • The <select> control
  • The <canvas> element
  • Certain <svg> elements

For details on this improvement, see the article below.

Windows Defender Application Guard

Windows Defender Application Guard for Microsoft Edge, announced on the Microsoft Edge Blog last September, is now available to enterprise users.

Microsoft Edge running in Application Guard gives enterprises the highest level of protection against zero-day attacks and malware targeting Windows.

Persistence of Microsoft Edge data while using Application Guard is now supported as well. When enabled, data such as favorites, cookies, and saved passwords is preserved across Application Guard sessions. The persisted data is not shared with or exposed on the host, but it remains available to future Microsoft Edge Application Guard sessions.

Screen capture showing a WDAG tab in Edge

Summary

This article introduced the new Microsoft Edge (EdgeHTML 16) features in the Windows 10 Fall Creators Update.

Microsoft Edge keeps evolving steadily, guided by user feedback.

Please send feedback and ratings for Microsoft Edge from the menu: [...] (Settings) - [Send feedback].

If you are curious about responses from the development team or the subsequent progress of a suggestion, I also recommend posting it to the Microsoft Edge Platform Suggestion Box.

I will keep sharing Edge and web-related information as well, so please follow @samum_MS.



Future Decoded 2017–Register Now

Future Decoded 2017

Future Decoded is Microsoft's annual UK conference which describes itself as "A vision of the modern digital business for today and tomorrow". This year, the conference is held on 31st October & 1st November 2017 at the ExCeL, London.

If you are not already registered, you can register for free at https://futuredecoded.com 

Unlike previous years, technical content is split across both days, so register for the day (or days) that suits you best:


Tuesday 31st October 2017

Industry Insights & Futures

Aimed at professionals and leaders from companies of every size, you will explore the impact technology, AI & quantum will have on your industry and your profession.

The Open Cloud

Developers & IT Pros, we will explore cloud & open source technology with leading figures from Microsoft and Open Source Communities

Wednesday 1st November 2017

Azure summit: Digital Disruption Today and Tomorrow

For enterprise business & technology decision makers, you will see how organisations old and new are using the cloud to fundamentally change their businesses.

Tech Deep Dives

Deep technical content, by techies for techies, with a focus on code, scripts, data and demos for all developers, data people and IT Pros.

We aim to show you how you can use Azure and AI services like Cognitive Services, Bot Framework, and Cortana Skills to bring AI to your applications, and much more besides, from Machine Learning and Neural Networks (covered by the Data & Machine Learning track) through to Quantum Computing.

Please use the session builder to add these tracks to your schedule if you are attending Future Decoded.

Register Now

https://www.futuredecoded.com/content/tickets
