
Announcing the 2018 Imagine Cup Online Finals


Microsoft’s Imagine Cup is a global competition that offers student developers a chance to win up to USD 100,000, Azure credit, and mentoring for creating cloud-based solutions that will change the world. Each year, Imagine Cup qualifying competitions take place around the world, with students submitting incredible projects powered by Azure. Top projects in each qualifying competition are selected to advance to the World Finals and a chance to compete against the best in the world for prizes and the title of Imagine Cup World Champion.

For students around the world who do not have the opportunity to compete in a national or regional Imagine Cup final, we have launched the Imagine Cup Online Final: a platform to submit projects directly to Imagine Cup headquarters in Redmond, WA, to be judged for a chance at qualifying for the Imagine Cup World Finals!

Want to see if you qualify?

  1. Read the 2018 Imagine Cup Official Rules to make sure you qualify
  2. Register for the 2018 Imagine Cup
  3. Visit your account page to find out which competition you qualify for
  4. Turn your ideas into reality and submit your Imagine Cup project by April 30, 2018

Best of luck to all Imagine Cup competitors; we can’t wait to see what you create!


Troubleshooting Tools for App Service Certificate


This blog post describes the various tools available to debug issues with App Service Certificate resources that you may be using with Web Apps or other Azure services. SSL is a critical part of your application, and when configuring or renewing a certificate there are a few issues you might run into. The tools listed below can provide the information you need to self-debug the issue in most cases.

  1. Verify the status of your certificate: Check whether the status of your certificate is ready for use. Sometimes the certificate may have a domain verification step pending, and this status tile provides information on the steps you need to take.
  2. Debug using Timeline: View the list of historical activities or operations that have occurred on the App Service Certificate resource using the Timeline feature to help debug the issue.
  3. Sync a certificate: The Web App service runs a background job that periodically (once a day) syncs all App Service Certificates. Hence, when you rotate or update a certificate, the application sometimes still retrieves the old certificate rather than the newly updated one. This is because the job has not yet run to sync the certificate resource. To force a sync of the certificate, click the Rekey and Sync setting and then click the Sync button.
  4. Refer to the FAQ documentation: Get access to the appropriate App Service Certificate documentation to help resolve issues with configuration, rekey and sync, renewal, etc.

 

If the above tools don't help you resolve the certificate-related issue, then please contact Microsoft Azure Support.

Experiencing Data Access Issue in Azure and OMS portal for workspace data – 02/20 – Investigating

Update: Tuesday, 20 February 2018 11:43 UTC

We continue to investigate issues within the Azure Log Analytics service. We have identified an issue with the latest monitoring agent upgrade which is causing our system to not process data as expected. Customers using the Microsoft Monitoring Agent (MMA) extension to connect Azure Virtual Machines (VMs) to Log Analytics may see the MMA extension stuck in an error or transitioning state in either the Azure Portal or the OMS Portal. On-premises VMs which have the MMA extension installed are not impacted.
 The initial findings indicate that the problem began at 02/15 19:30 UTC. We currently have no estimate for resolution. 
  • Work Around: Rebooting the impacted VM will help in mitigating the issue
  • Next Update: Before 02/21 00:00 UTC

-Mohini Nikam


Update: Tuesday, 20 February 2018 05:31 UTC

We continue to investigate issues within the Azure Log Analytics service. We have identified an issue with the latest monitoring agent upgrade which is causing our system to not process data as expected. Some customers continue to experience issues while accessing workspace data in the Azure portal as well as in the OMS portal. The initial findings indicate that the problem began at 02/15 19:30 UTC. We currently have no estimate for resolution.
  • Work Around: Rebooting the impacted VM should help in mitigating the issue
  • Next Update: Before 02/20 10:00 UTC

We are working hard to resolve this issue and apologise for any inconvenience.

-Mohini Nikam


Initial Update: Tuesday, 20 February 2018 01:29 UTC

We are aware of issues within the Azure Log Analytics service and are actively investigating. We have identified an issue with the latest monitoring agent upgrade which is causing our system to not process data as expected. Some customers may experience issues while accessing workspace data in the Azure portal as well as in the OMS portal. We will provide more updates as we learn more about fixing the issue, though we have a workaround (rebooting the VM) to mitigate this problem.
  • Work Around: Rebooting the impacted VM should help in mitigating the issue
  • Next Update: Before 02/20 05:30 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Praveen

Know your Azure Subscription Quota and Usage


There are two ways to find out where you stand against Azure subscription limits:

  1. Wait until your script fails to provision resources.
  2. Regularly check important services against their limits.

Obviously the second option is the desired one, and PowerShell does give a few easy-to-run cmdlets.

A few things you need to do (a sketch of the commands follows this list):

  1. Log in to Azure using PowerShell.
  2. Check which subscription the session is pointing to. If you are part of multiple subscriptions, you need to point to the right one.
  3. List the subscriptions you are part of and pick up the right GUID.
  4. Run the subscription-selection command with that GUID.
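The original commands were not preserved in this copy of the post; here is a minimal sketch of the four steps using the AzureRM module that was current at the time (cmdlet and parameter names are my assumption and may vary by module version):

# 1. Log in to Azure (AzureRM module)
Login-AzureRmAccount

# 2. Check which subscription the session is currently pointing to
Get-AzureRmContext

# 3. List the subscriptions you are part of, and pick up the right GUID
Get-AzureRmSubscription

# 4. Point the session at the right subscription using its GUID
Select-AzureRmSubscription -SubscriptionId "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"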

Check the VM Usage
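The actual command was stripped from this copy of the post; with the AzureRM.Compute module it would plausibly be (the location is a placeholder):

# Compute quota and current usage (cores, VMs, availability sets, ...) for a region
Get-AzureRmVMUsage -Location "West Europe"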

Check the Network Usage
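Again a sketch, using the AzureRM.Network cmdlet this post alludes to below:

# Network quota and current usage (public IPs, NICs, NSGs, ...) for a region
Get-AzureRmNetworkUsage -Location "West Europe"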


Check the Storage Account Quota
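Another sketch; note that, unlike the compute and network cmdlets, this one reports usage for the whole subscription rather than per region:

# Storage account quota and current usage for the subscription
Get-AzureRmStorageUsage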


There are a couple of others as well. To get them all:
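One way to discover them, assuming the AzureRM modules are installed:

# Discover the usage/quota cmdlets available in the installed modules
Get-Command -Verb Get -Noun *Usage*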

In fact, keep checking: between PowerShell module releases on my machine, the Network usage cmdlet has been added, so expect more to come.

Now, these are all separate commands. Let's combine them and generate output that opens in Excel (CSV); see the sketch below.
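The combined script was also lost from this copy; here is a minimal reconstruction under the same AzureRM assumptions (property names such as Name.LocalizedValue are my best guess and may differ slightly between module versions):

$location = "West Europe"   # placeholder region

$usage = @()
$usage += Get-AzureRmVMUsage -Location $location |
    Select-Object @{N='Category';E={'Compute'}}, @{N='Name';E={$_.Name.LocalizedValue}}, CurrentValue, Limit
$usage += Get-AzureRmNetworkUsage -Location $location |
    Select-Object @{N='Category';E={'Network'}}, @{N='Name';E={$_.Name.LocalizedValue}}, CurrentValue, Limit
$usage += Get-AzureRmStorageUsage |
    Select-Object @{N='Category';E={'Storage'}}, @{N='Name';E={$_.LocalizedName}}, CurrentValue, Limit

# Export to CSV so the results open directly in Excel
$usage | Export-Csv -Path .\AzureUsageReport.csv -NoTypeInformation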

Sharing a Board Maker app through the Microsoft Store


This post describes the steps taken to have a board maker app available at the Microsoft Store, using the Desktop Bridge. It also stresses how an app doesn't need to do much to be helpful – it just needs to do what's needed.

Apology up-front: When I uploaded this post to the blog site, the images did not get uploaded with the alt text that I'd set on them. So any images are followed by a title.

 

Introduction

A few years ago I was in contact with an organization which works with kids with severe disabilities, many of whom are non-verbal, and I suggested that I build an app which could have custom text spoken by the computer when the student clicks on a button with whatever type of input device they were using. I'd already built many exploratory apps that included that functionality, and I knew .NET and Visual Studio made it practical for me to build this sort of app quickly. The goal would then be to update the app to make it as useful as possible for the students. For example, some of the students have low vision, and so I felt it might be helpful for the various elements in the UI to be able to appear with customizable colors and sizes.

Some of the visual customizations I made to the app would have been straightforward to make if I'd built the app as a UWP XAML app. And to have done so would have meant it would be simple for me to get the app up at the Microsoft Store and to manage updates to it. But when I first built the app, a number of people who would use it had machines running on Windows 7. Because of that, I built a WinForms app which could run on both Windows 7 and Windows 10 computers.

Before long, I had a communication tool, whereby the student could click on a button to have a specific announcement made. And over time the app evolved into a board maker app, where at any given time, it would present a set of buttons and when a particular button was clicked, the app would make some announcement and then would update its UI to show a different set of buttons. What's more, in addition to text being shown on the buttons, an image or video could be shown, and audio could be output when the button is clicked. All this functionality is practical for me to achieve thanks to the power of .NET. A collection of features available in the app is shown at 40 things to do in the free Herbi Speaks board-making app.

While working on this app, I was reminded of a very important point:

An app doesn't have to do much to be helpful – it just needs to do what's needed.

 

Many developers would consider the app in its earlier forms to have too few features to be helpful. And even with its larger set of features, it might be considered too unintuitive or just plain unprofessional to be something that anyone would want to use. And certainly, there are many ways in which the app could be improved, and it does have constraints that have led to questions from some customers. And something I thought long and hard about is the fact that the app's not accessible. For example, it can't be used only with a keyboard. But that said, I had to ask myself, is it better to make the app available to people, than not to? Should I hold off on making the app available until I've sorted out all the potential improvements, or should I share it with people who would find it helpful as it is, and prioritize improvements based on feedback? In the end I decided to share it out.

The screenshot below shows a demonstration Herbi Speaks board with three buttons on it. One button shows a question mark, and when that button is clicked, the app says "Where is Jasmine?". Another button shows a sheep, and the last button shows Jasmine the spaniel. Typically for such a board, when the Jasmine button is clicked, the app would make a helpful audio announcement or perhaps play a video of Jasmine, and then refresh to present a different board. Note that the app is running in its fullscreen mode, meaning that there is no other UI presented which the person using the app can interact with.

 

Figure 1: A demonstration Herbi Speaks board showing three buttons.

 

Was it worth building the app?

Yes. The app can be seen in use at around the 1:30 mark in Herbi Speaks in action, and while I clearly had a bug to fix in the app at the time, the video did show me how the app could play a useful part in the student's education.

 

I also received the following feedback from the mother of another person who uses the app.

 

"The Herbi Speaks app has been very helpful for us. The personalized boards we can make ourselves, with pictures and sounds our son relates to and really enjoys, ….

There are limitless topics we can use to interest him. We only need to use our imagination. He much prefers sounds attached to the images to really keep him interested. For instance, I made 30 boards all linked together of fish and other animals that live in water. They start out very simple with just one fish in the middle of the blue screen. The fact that we can pick the colors we want on everything is great! As he goes along I add in water plants on the screen to make it a bit harder to find the fish. I downloaded the fish/animal images from google search as well as water sounds that happen after the text speaks. Number 10 is a whale that has a nice long whale song. Toward the end is a dolphin and the sound it makes.

My severely disabled son has learned to breeze through all 30 different boards in 10 to 15 minutes. The challenge for us is in finding what does interest him. When we make a hit it is amazing to see what he can do. I have made many boards now on different subjects. Our goal of course is to get him to the point where he can actually communicate his wants, needs and feelings to us. Right now he is just having fun playing on the computer and learning what he can do."

 

I find this sort of feedback extremely helpful. And while I'm pleased that the app is proving to be of some use as it is today, it does make me wonder how I can make it more useful to this family. Is it practical for me to make the app more helpful as a communication tool for the person using the app?

 

And this really brings up the crux of all this. Many developers reading this will feel they could build an app like Herbi Speaks, or in fact something much higher quality. And I'm sure they could. So is it practical for you to build a simple app, with a very focused feature set, which helps plug a gap in what's available to someone today? If you do, I'd strongly recommend that you consider sharing what you've built with the world through the Microsoft Store. In my experience, leveraging the Microsoft Store is way more pleasant than messing around with my own app setup processes.

The following discussion relates to my experiences with making the Herbi Speaks app available at the Microsoft Store.

 

Making the WinForms app available at the Microsoft Store

I'd recently made another WinForms app available at the Microsoft Store, as described at Sharing assistive technology through the Microsoft Store, so I took a look at my comments there first, to reduce the chances of me missing a step. And below is how things went working on getting this second app up at the Store.

 

Preparing the app for submission to the Store

In Visual Studio, I added a Desktop Bridge Packaging Project to the solution containing my WinForms project. I already had a version of Visual Studio which supports packaging projects. When doing the "Add New Project" thing, it's "Windows Application Packaging Project" beneath Visual C# -> Windows Universal.

I then referenced the WinForms app in the packaging project, by right-clicking on the packaging project's Application item, then Add Reference, and then selecting the WinForms app.

When I then tried to build the packaging project, I did get a couple of errors. One error related to a mismatched processor type between the WinForms app and the packaging project. I'd hit that before, when preparing the other WinForms app for the Store. In both cases I addressed this by going to the Properties for the WinForms project, then Build, and changing the Platform Target from what I happened to previously have it set as, "x86", to "Any CPU". The other build error related to the version of the .NET Framework I was targeting in the build. It turned out I was targeting .NET Framework 4, which I'd probably selected when I first built the app. I changed that to .NET Framework 4.6.1, as I expect my customers will already have at least that on their machines now, and then the packaging project built fine.

Then it was a case of preparing the package for the Store. When I first considered the idea of moving the app to the Store a while back, I reserved the app name at the Store, so I didn't have to do that now. But I did have to associate the new packaging project with that reserved name. I did that by right-clicking on the packaging project, selecting Store, and then "Associate App with the Store".

My next step was to have the app package that I was building verified as being ok to upload to the Store. And sure enough, when I ran the verifier after building the package, I was told that the verification failed. I was told that the package was using the placeholder image assets. I tripped up on exactly the same thing when I worked on my previous app, and you'd think I'd have learnt, but apparently not. So again, it was really handy to get this early notification, rather than it being detected further down the line.

And having hit that, I then decided I'd try to be a little more proactive rather than reactive for a change. Last time I went through this process, my app wouldn't run after deploying from the package. That was because I'd built my WinForms app in such a way that an image would be required at run-time, but the image wasn't available to the exe after it was deployed. I was pretty sure that there wouldn't be an issue this time relating to images, but I was wondering about a couple of interop dlls that the app used. Would they be included in the package as the app needed them to be? Well, that turned out to be just fine. So long as my WinForms app is set up with the related assembly in the list of app references, and that assembly has a Copy Local property of True, then the packaging project will automatically include what's needed in the package. And I could verify that that was the case by deploying the Packaging Project on my machine, and making sure the app ran as expected. Nice one Visual Studio!

 

Submitting the app to the Store

It was now time to go through the steps of submitting the app to the Store. This started off being routine, but it was interesting for me when I got to the checkbox that said "This product has been tested to meet accessibility guidelines". I left that box unchecked, given that I know the app needs to be enhanced in a number of ways before I'd call it accessible.

I then submitted the app to the Store, and before long got told that it had failed certification. There were two reasons for that. The first is that I'd not supplied a link to a Privacy Policy. Such a link is required, because the app has access to my customers' user data folders. That's a requirement of the app, because my customers can associate images and media with buttons shown in the app, and the file browser in the app can take my customers to their Pictures and Music folders. And by default, the app also saves the file containing the data on the boards created in the app, in a folder beneath the Documents folder. So I created a privacy policy at my own site, and duly added a link to that when I later resubmitted the app at the Store.

The other reason why the submission failed certification was that the app had not gone through the full process required for submitting desktop apps at the Store, whereby I complete the form at Have you already converted your app with the Desktop Bridge? and then someone from the Store team works with me to confirm that the desktop app can be made available at the Store. I intentionally hadn't done that on this occasion, as I'd gone through that process with the previous app, and wasn't sure if it was required for all subsequent apps. But it turns out that I did need to go through the process for this second app. Fair enough.

This is where things got really interesting. After I'd supplied an app package to the person from the Store team for verification, he told me the app didn't start.

 

Make sure what you're supplying to the Desktop Bridge is good

When the person from the Store team let me know that the app didn't start, he sent over an event log which helped me learn that the error seemed related to my use of the .NET AxWindowsMediaPlayer control. (That's a really handy control, which makes it easy to add audio and video functionality to a WinForms app.) So as a test, I copied over my appx package to another machine with a recent version of Windows 10, and deployed the app there. To my surprise, the deployed app didn't start on that machine either. It turned out that Windows Media Player (WMP) wasn't available on that machine, and so I decided to assume that that was the same issue on the machine used by the person from the Store team. And it seemed reasonable to assume that if WMP isn't available on a machine, then my use of the AxWindowsMediaPlayer control could be impacted.

This is where I got distracted. I told myself that I should address this by updating my Desktop Bridge Packaging Project to have it set up the media-related components required by the app, if those components weren't already available on the target machine. So I would add whatever redistributable was required to the Packaging Project. I failed to find such a redistributable, but then someone pointed out that the issue here isn't the Desktop Bridge Packaging Project anyway. Rather, it's my WinForms app. The app isn't going to run on the target machine regardless of whether it's deployed through an appx package, or simply run as a WinForms exe. So I need to figure out what to do about my WinForms app first, and consider the Desktop Bridge after that.

So while it seems pretty obvious now, I should always make sure my WinForms story is rock-solid first. If I've got holes in that story, I shouldn't be trying to figure out how to plug those with changes to my appx package.

And then someone pointed out that WMP is available in Windows 10, but at some point it became an optional feature which is not available by default. So when the app's run, the AxWindowsMediaPlayer control may or may not be available. I don't know the recommended way of detecting whether the WMP feature's available at run-time, so I decided to add some exception handling around my AxWindowsMediaPlayer initialization. If I hit an exception, then I'll assume the problem is the feature's not available, and pop up a message to let the customer know what's going on. (Granted, I can't be sure that the exception is always related to the optional feature not being set up, but I'm hoping in practice it will be.)

The screenshot below shows the message now popped up by the app. It contains the following text.

The audio and video features of the Herbi Speaks app are not available on this computer. To enable these features, please visit the "Apps & features" page in the Windows Settings app. Then go to the "Manage optional features" page and add the "Windows Media Player" feature. When the Herbi Speaks app is then restarted, the audio and video features should be available.

 

Figure 2: A message box shown by the app when the app fails to initialize its media-related feature on startup.

 

In order to test this, I uninstalled the WMP optional feature on my dev machine. As an added twist to this, once I'd uninstalled the WMP feature, I couldn't build my app in Visual Studio. So I had to add the WMP feature through the Settings app to build my app, and then uninstall the feature to test my app. But after going back and forth a few times with adding and uninstalling the WMP feature, I'd updated the app such that my customers are made aware of how they can access the media features in the app, if those features are not available by default.

Note: For me, it was important to be using the Windows 10 Settings app to review the state of the WMP feature. The classic control panel also has "Turn Windows features on or off" functionality, but I couldn't find any reference to Windows Media Player there.

 

And so the app becomes available at the Store

My next step was to supply an updated appx package to the person from the Store team, who could then deploy and run the app, and perform the required verification steps. With his sign-off I could submit the appx package to the Store, and follow the standard steps for submitting any app. Given my learnings around the Windows Media Player, I added the following related details to the app's Release Notes which would appear in the Store:

If the Herbi Speaks app is running on a version of Windows 10 which does not have the Windows Media Player optional feature enabled, then audio and video cannot be played in the Herbi Speaks app. To enable audio and video in the app, enable the Windows Media Player optional feature from the Windows 10 Settings app.

And with that, the app soon became available at Herbi Speaks at the Microsoft Store.

 

Summary

There are many ways in which the Herbi Speaks app could be improved, for example, making it usable with a wider variety of input and output mechanisms. And giving the people who are setting up the boards some clue as to what to do when the app first runs would be nice. But even with its current constraints, the feedback I've received lets me know that the app can be very helpful as it is today, and as such, I should make it available to as many people as I can. For me, this means making the app available through the Microsoft Store.

Please do consider whether it'd be practical for you to build a helpful tool which plugs a gap in what's available to someone today. And remember, the app really doesn't need to do much, it just needs to do what's needed. And if you do build such an app, consider sharing it with the world through the Microsoft Store.

Guy

More Showplan enhancements – Row Goal


Cross post with http://aka.ms/sqlserverteam

As I shared before, we have been working for over a year to make showplan the one-stop-shop for query performance analysis and troubleshooting (as much as possible).

With the recent release of SQL Server 2017 CU3, we released yet more showplan enhancements: you can see other posts related to showplan enhancements here.

In this article I’ll talk about one of these showplan improvements, aimed at making the Optimizer's use of a row goal, and its impact on query execution, more discoverable: a new operator property, EstimateRowsWithoutRowGoal. This will also be available in the upcoming SQL Server 2016 SP2.

So what is row goal?

When the Query Optimizer estimates the cost of an execution plan, it usually assumes that all qualifying rows from all tables have to be processed. However, some query patterns cause the Optimizer to search for a plan that will return a smaller number of rows, with the purpose of doing it faster. So row goal is a very useful optimization strategy for certain query patterns.

How is row goal used?

This can occur if the query specifies a target number of rows (a.k.a. row goal) that may be expected at runtime. When a query uses a TOP, IN or EXISTS clause, the FAST query hint, or a SET ROWCOUNT statement, that row goal is used as part of the query optimization process. If the row goal plan is applied, the estimated number of rows in the query plan is reduced, because the Optimizer assumes that a smaller number of rows will have to be processed, in order to reach the row goal.

When row goal is very low, and a JOIN is required, then the Optimizer prefers a nested loops join, because its initial cost (the cost to produce the first row) is relatively low. But other types of JOIN may also be used if the row goal is larger:

  • A hash join is usually a good choice for joining larger inputs. Although it has a higher initial cost, because it has to build a hash table before any rows can be returned, once built, the hash join is generally cheaper.
  • But if the two join inputs are sorted on their join column (for example, if they were obtained by scanning sorted indexes), a merge join is the fastest join operation and can be chosen instead.

Example of row goal benefits

Let’s use the following query in Adventureworks2016, and look at the resulting query execution plan and execution metrics:

SELECT TOP (100) *
FROM Sales.SalesOrderHeaderBulk AS s 
    INNER JOIN Sales.SalesOrderDetailBulk AS d ON s.SalesOrderID = d.SalesOrderID
WHERE s.TotalDue > 1000
OPTION (RECOMPILE);
GO

Notice the nested loops plan driven by the low row goal (100):

[Images: the query execution plan, with a Nested Loops join driven by the low row goal]

This query executed in just over 1.6s with 40ms of CPU time. Let’s keep these in mind for later.

[Image: properties of the Clustered Index Scan, showing Actual Number of Rows and Number of Rows Read]

Looking at the properties of the outer input for the nested loops, notice this scan returned 1550 rows (Actual Number of Rows), and only read the 1550 rows that satisfy the pushed down predicate (Number of Rows Read). Good!

And we see row goal optimization was used to benefit performance, because without row goal, the estimated rows would be about 6M.

We can see this information in the new operator property EstimateRowsWithoutRowGoal. In this plan, the new property can be seen in this Clustered Index Scan, and also in other operators up the tree, like the Compute Scalars.

This new property is only added to a plan operator if row goal was evaluated and used; if not, this property is absent.

In fact, we can use a USE HINT to disable row goal optimization, and see how this really helped our query plan shape and execution metrics:

SELECT TOP (100) *
FROM Sales.SalesOrderHeaderBulk AS s 
    INNER JOIN Sales.SalesOrderDetailBulk AS d ON s.SalesOrderID = d.SalesOrderID
WHERE s.TotalDue > 1000
OPTION (RECOMPILE, USE HINT('DISABLE_OPTIMIZER_ROWGOAL'));
GO

 

[Images: the new query execution plan and execution metrics, with row goal disabled]

Clearly, row goal was a benefit here: with it disabled, execution time increased to just over 2.4s (60% worse). Row goal is a benefit most of the time it is used.

Example of row goal that we can improve upon

Most of the time row goal is a benefit, but that's not always clear-cut, which is why the new information can be useful. Let's see an example using another query:

SELECT TOP 250 *
FROM Production.TransactionHistory H
INNER JOIN Production.Product P ON  H.ProductID = P.ProductID
OPTION (RECOMPILE)
GO

The resulting query execution plan is using a nested loops, as expected for a low row goal. For reference, also notice the execution metrics:

[Images: query execution plan using a nested loops join, and execution metrics]

This singleton query executes very fast, no question. But let’s imagine you were seeing these numbers x100, and this query executed many times per minute.

I can see in the Clustered Index Scan below, how the estimated rows without row goal (EstimateRowsWithoutRowGoal) is much larger than the number of rows read (Number of Rows Read): 113K to 250. Row goal drove some decisions in the Optimizer that seem beneficial, and produced a nested loops plan with the specific order of outer and inner tables – TransactionHistory and Product, respectively. On the inner table Product, SQL Server does just a seek for the required 250 lookups, as expected.

[Images: Clustered Index Scan properties showing EstimateRowsWithoutRowGoal and Number of Rows Read]

Keeping in mind the metrics above, and because we see row goal was used, I can try to disable row goal, and see if I get better performance.

SELECT TOP 250 *
FROM Production.TransactionHistory H
INNER JOIN Production.Product P ON  H.ProductID = P.ProductID
OPTION (RECOMPILE, USE HINT('DISABLE_OPTIMIZER_ROWGOAL'))
GO

The resulting query execution plan now uses a hash join, and notice how the join inputs have changed order: Product is the build table, TransactionHistory the probe.

[Image: query execution plan using a hash join]

What about execution metrics? Better, no question.

[Image: execution metrics]

The build table Product only has 504 rows (see below), and once built, it’s much cheaper than a nested loop, which is what I see in this example. Even though the probe table TransactionHistory is much larger, now we need only probe the required 250 rows for our TOP clause.

So disabling row goal is better in this case, yielding an 89% improvement in execution time (165ms to 18ms).

[Images: plan operator properties and execution metrics]

Before this showplan improvement, if you were tasked with analyzing a query using a pattern that may be using row goal, you could only guess if it was present. But with the EstimateRowsWithoutRowGoal property, it becomes possible to see if row goal was in fact used, and then engage in these tuning exercises.

Pedro Lopes (@sqlpto) – Senior Program Manager

Be vigilant against phishing attempts

[SfBO] Deleting a user's job title, department, or photo is not reflected


Hello, this is the Japan Skype/Lync Support team.
In Skype for Business Online (AD-synchronized environments, hybrid environments, and so on), changes to a user's job title, department, or photo are reflected, although it may take some time.

However, when a job title, department, or photo is deleted, the deletion is not reflected and the previous information remains.

This is a limitation at this time.
Skype for Business Online periodically receives and applies update information for each attribute. (It applies "updates", not the complete set of information.)

However, because a deletion is not recognized as a change, the old value remains as it was.
We recognize this behavior as an issue, but considering the scope of impact a fix would have, it has not been corrected to date.

We apologize for the inconvenience, but please work around this by entering a different value, or by supplying a blank value or a white image, instead of deleting the information.

 


Data source bindings for TFS/VSTS REST APIs


Overview

Data source bindings essentially bind a drop-down input field in the UI (e.g., a task input) that needs to be dynamically populated with values to the corresponding REST API that must be invoked to fetch the list of values.

In the case of REST APIs supported by external services (e.g. Azure, TeamCity, BitBucket, etc.), the data source binding takes the ID of the endpoint that needs to be used to query the values, along with the details of the REST API.

For example, the AzureRmWebAppDeployment task defines a data source binding that refers to the AzureRMWebAppSlotsId data source:

{
  "target": "SlotName",
  "endpointId": "$(ConnectedServiceName)",
  "dataSourceName": "AzureRMWebAppSlotsId",
  "parameters": {
    "WebAppName": "$(WebAppName)",
    "ResourceGroupName": "$(ResourceGroupName)"
  },
  "resultTemplate": "{\"Value\":\"{{{ #extractResource slots}}}\",\"DisplayValue\":\"{{{ #extractResource slots}}}\"}"
}

The following blog describes the different fields of the data source binding:

https://blogs.msdn.microsoft.com/sriramb/2016/09/15/service-endpoints-data-sources/

However, there are scenarios where inputs need to be populated with values obtained from VSTS REST APIs, for example, getting a list of VSTS build definitions, VSTS feeds/packages, etc.

These queries do not require an endpoint to be created and specified in the data source binding.

In order to support such queries, data source bindings support taking endpointId in the format:

tfs:{service}

For example, the DownloadPackage task defines the following dataSourceBinding:

{
  "target": "feed",
  "endpointId": "tfs:feed",
  "endpointUrl": "{{endpoint.url}}/_apis/packaging/feeds",
  "resultSelector": "jsonpath:$.value[*]",
  "resultTemplate": "{ \"Value\" : \"{{{id}}}\", \"DisplayValue\" : \"{{{name}}}\" }"
}
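To make the mechanics concrete, here is a rough PowerShell equivalent of the REST call that a tfs:feed binding like the one above resolves to. The account name, PAT, and feeds host below are hypothetical placeholders; at runtime the binding substitutes {{endpoint.url}} for you:

# Hypothetical account and personal access token
$account = "fabrikam"
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Equivalent of endpointUrl: {{endpoint.url}}/_apis/packaging/feeds
$response = Invoke-RestMethod -Uri "https://$account.feeds.visualstudio.com/_apis/packaging/feeds" -Headers $headers

# Equivalent of resultSelector (jsonpath:$.value[*]) plus resultTemplate ({id, name} -> {Value, DisplayValue})
$response.value | ForEach-Object { [pscustomobject]@{ Value = $_.id; DisplayValue = $_.name } }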

Supported VSTS Services

Below is the set of VSTS services we currently support within dataSourceBindings:

tfs:teamfoundation – Any micro service hosted within TFS (e.g. Build, Test etc.)

tfs:packaging – Packaging service

tfs:feed – Feed service

tfs:rm – Release Management service

Note that data source bindings using these services will work seamlessly in TFS as well as VSTS.

Support for tfs:packaging & tfs:feed is added with TFS 2018 Release.

Support for tfs:rm is added with TFS 2018 Update 2 Release.

VSTS REST APIs are documented here: https://docs.microsoft.com/en-us/rest/api/vsts/

 

SET IMPLICIT_TRANSACTIONS Behavior On Azure SQL Data Warehouse and APS


Working with transactions in Azure SQL Data Warehouse (ADW) and the Analytics Platform System (APS, aka PDW) is a bit different from what one would expect. Though most of us tend to operate under the default behavior with IMPLICIT_TRANSACTIONS OFF, developers who interact with ADW/APS using other languages may find transaction handling using their language constructs does not work as expected.

Before we begin . . .

Let's first consider the following sample Java program:

import java.sql.*;

public class TrxTest {
    public static void main(String[] args) {
        // Create a variable for the connection string.
        //String connectionUrl = "jdbc:sqlserver://localhost;instanceName=SQL2016;databaseName=SmpHotnessDB;user=SmpUser;password=*****";
        String connectionUrl = "jdbc:sqlserver://MppHotnessServer.database.windows.net;databaseName=MppHotnessDB;user=MppUser;password=*****";

        // Declare the JDBC objects.
        Connection connection = null;
        Statement statement = null;

        try {
            // Establish the connection.
            Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            connection = DriverManager.getConnection(connectionUrl);

            statement = connection.createStatement();
            statement.execute("CREATE TABLE dbo.transaction_001 (id INT)");
            statement.execute("INSERT INTO dbo.transaction_001 (id) VALUES (1)");

            // Try one transaction
            // BEGIN TRANSACTION
            connection.setAutoCommit(false);
            statement.execute("DELETE FROM dbo.transaction_001 WHERE id = 1");
            statement.execute("INSERT INTO dbo.transaction_001 (id) VALUES (10)");
            connection.commit();
            connection.setAutoCommit(true);
            // END TRANSACTION

            // Following statement fails with "Operation cannot be performed within a transaction"
            statement.execute("CREATE TABLE dbo.transaction_001_new (id INT)");
        }
        // Handle any errors that may have occurred.
        catch (Exception e) {
            e.printStackTrace();
        }
        finally {
            if (statement != null) try { statement.close(); } catch (Exception e) {}
            if (connection != null) try { connection.close(); } catch (Exception e) {}
        }
    }
}

Assuming you have both an SMP instance and an MPP (ADW/APS) instance handy, you can run the above and switch between the two platforms to test the differences. When running against an SMP instance, everything works as one would expect. However, when executing against ADW/APS, you receive the following error:

[Image: Java stack trace showing the error "Operation cannot be performed within a transaction"]

What's Happenin'

The error you receive from ADW/APS is due to an enforced restriction on these platforms. You are likely familiar with (and perhaps frustrated by) the fact that DDL operations can't be wrapped in a transaction. Regardless, this error is admittedly confusing, because the setAutoCommit(true) method should have committed any open transactions, leaving the open transaction count at 0. Though SMP cleans up when the connection.setAutoCommit(true) statement is issued, ADW/APS doesn't (exactly). In short, this is because ADW/APS will immediately open a transaction following a SET IMPLICIT_TRANSACTIONS ON statement; more than this, ADW/APS will start a new transaction even after a COMMIT TRAN is issued while IMPLICIT_TRANSACTIONS is ON. As such, ADW/APS perpetually maintains a transaction count of (at least) 1 while IMPLICIT_TRANSACTIONS is ON. In contrast, SMP won't open a new transaction until a new DML or DDL statement is issued after the COMMIT TRAN statement. You can test this behavior by issuing the following statements (which are effectively what the Java program sends to the database, with some SELECT @@TRANCOUNT statements inserted) one at a time against each platform:

SELECT @@TRANCOUNT

CREATE TABLE dbo.transaction_001 (id INT)
INSERT INTO dbo.transaction_001 (id) VALUES (1)
SELECT @@TRANCOUNT

set implicit_transactions on;
SELECT @@TRANCOUNT

DELETE FROM dbo.transaction_001 WHERE id = 1
INSERT INTO dbo.transaction_001 (id) VALUES (10)
SELECT @@TRANCOUNT

IF @@TRANCOUNT > 0 COMMIT TRAN
SELECT @@TRANCOUNT

IF @@TRANCOUNT > 0 COMMIT TRAN
SELECT @@TRANCOUNT

set implicit_transactions off
SELECT @@TRANCOUNT

CREATE TABLE dbo.transaction_001_new (id INT)
SELECT @@TRANCOUNT

Make It Right

This difference in behavior complicates existing coding patterns against SQL Server SMP when applied to ADW/APS because a manual COMMIT TRAN must be issued in order to properly close out a transaction even after setAutoCommit(true) is executed:

// BEGIN TRANSACTION
connection.setAutoCommit(false);
statement.execute("DELETE FROM dbo.transaction_001 WHERE id = 1");
statement.execute("INSERT INTO dbo.transaction_001 (id) VALUES (10)");
connection.commit();
connection.setAutoCommit(true);
statement.execute("if @@trancount > 0 commit tran");
// END TRANSACTION

When this is in place, IMPLICIT_TRANSACTIONS is OFF and all open transactions have been properly closed.

Before We're done - IMHO . . .

One could argue that the behavior of setAutoCommit(true) could be improved to issue the following:

IF @@TRANCOUNT > 0 COMMIT TRAN
set implicit_transactions off
IF @@TRANCOUNT > 0 COMMIT TRAN

Instead of:

IF @@TRANCOUNT > 0 COMMIT TRAN
IF @@TRANCOUNT > 0 COMMIT TRAN
set implicit_transactions off

Alas, it doesn't . . . so you have to account for this. And this is the point: you can't necessarily always control how a given package/library/API implements its transaction control with a database; such is the case in the above scenario. However, you can better control transactions in ADW/APS if you:

  1. Leave IMPLICIT_TRANSACTIONS = OFF (aka, setAutoCommit(true)) and
  2. Utilize stored procedures to manage transactions rather than do it from another language's implementation

Helpful Links

https://docs.microsoft.com/en-us/sql/connect/jdbc/reference/setautocommit-method-sqlserverconnection
https://docs.microsoft.com/en-us/sql/t-sql/statements/set-implicit-transactions-transact-sql
https://docs.microsoft.com/en-us/sql/connect/jdbc/building-the-connection-url

Thanks to my technical reviewers: Allan Miller & Charl Roux

Performance Degradation in South Central US – 02/20 – Mitigated


Final Update: Tuesday, February 20th 2018 22:54 UTC

We’ve confirmed that all systems are back to normal as of 21:20 UTC. Initial investigation shows an issue with a database in the affected scale unit. We are engaging our partners in SQL Azure to understand the root cause. Users should not face any more issues or errors accessing the service.

Sincerely,
Sri Harsha


Initial Update: Tuesday, February 20th 2018 21:50 UTC

We're investigating Performance Degradation in South Central US. A subset of customers have experienced intermittent timeouts and 404 errors accessing VSTS. The impact is currently mitigated (lasted between 21:05 and 21:20 UTC) but we are working to isolate the error source.

  • Next Update: Before Tuesday, February 20th 2018 23:00 UTC

Sincerely,
Sudheer Kumar

2/21 Webinar: Be a Full Stack Power BI Jedi – A walkthrough of Power BI's most advanced features through Star Wars data


Be a Full Stack Power BI Jedi – A walkthrough of Power BI's most advanced features through Star Wars data.

Are you a Power BI Jedi? Do you have the powers to become one, and join the fight for insights? In this hands-on session we will build a Power BI report together that analyzes Star Wars data from a web service. We will create custom functions to iterate over paged results and extract the entire dataset of characters and species. We will apply three different techniques to calculate the body mass index (BMI) of the Star Wars characters, and create smart mashups using a Cartesian product to classify BMI into different categories. Finally, we will apply What-If techniques to explore better BMI calculations for droids (being made out of metal doesn't help your BMI).

When:  2/21/2018

Where: https://www.youtube.com/watch?v=r0Qk5V8dvgg

Gil Raviv is a Microsoft MVP, Analytics Group Manager at Avanade, and a Power BI expert. As a former Senior Program Manager on the Microsoft Excel Product team, Gil led the design and integration of Power Query as the next-generation Get-Data technology in Excel 2016, and became an extreme M practitioner (M=Power Query formula language).

Gil is a highly skilled Software Development & Product Manager with 18 years of experience, and four US patents in the domains of social networks, cyber security, and web. He also held a variety of innovative roles in the Israeli Cyber Security industry, where he was harnessing the power of data analytics and big data to deliver new security products from advanced threat detection and reporting solutions for enterprises, to protection of kids on Facebook.

In his blog DataChant.com, Gil has been evangelizing Power BI & Power Query since he moved to his new home in Chicagoland, and he recently received the Microsoft MVP Award in the Data Platform category. Read more here.

Contact the author at gilra@datachant.com

Find Gil’s MVP Profile or other MVPs here.

TFS 2018.1 RTM is available


Today we released the final build of Team Foundation Server 2018 Update 1.  The key links are:

This release is primarily bug fixes for important issues, plus a few select features. The next "big" feature release will be TFS 2018.2, due in the May timeframe. I wrote at some length in the 2018.1 RC post about how to think about this release and our update cadence moving forward, so I won't repeat all of that here.

See the release notes for details on installing this release.

Please let us know if you have any issues with the release.

Brian

 

 

Azure Service Bus now integrates with Azure Event Grid!


We are happy to announce that Azure Service Bus is now able to send events to Azure Event Grid. The key scenario this feature enables is that Service Bus Queues or Topics/Subscriptions with a very low volume of messages no longer need a receiver polling for messages at all times. Service Bus will now send events to Azure Event Grid when there are messages in a Queue or Topic/Subscription and no receivers are present. You can create Azure Event Grid subscriptions for your Service Bus namespaces, listen to these events, and react to them by starting a receiver. With this feature, Service Bus can be used in reactive programming models.

In the current release, this feature is available for Premium namespaces only, and only in regions where Event Grid is available. We will add support for Standard namespaces at a later point in time. A rough setup sketch follows the links below.

  • Please find the full technical documentation here.
  • If you want to jump into the examples right away, please follow this link.
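As a rough sketch of the setup, assuming the AzureRM.EventGrid PowerShell module and entirely hypothetical resource names (the documentation linked above is the authoritative walkthrough, and cmdlet parameters may vary by module version):

# Hypothetical subscription and resource names; substitute your own
$namespaceId = "/subscriptions/<subscription-id>/resourceGroups/MyResourceGroup" +
               "/providers/Microsoft.ServiceBus/namespaces/MyPremiumNamespace"

# Subscribe an HTTP-triggered Azure Function (see known issue 3 below) to the namespace's events
New-AzureRmEventGridSubscription -ResourceId $namespaceId `
    -EventSubscriptionName "ActiveMessagesSubscription" `
    -Endpoint "https://myfunctionapp.azurewebsites.net/api/HttpTriggerFunction"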

Known issues:

  1. Right now, Azure Event Grid does not support multiple filters per subscription. This is a feature we are currently working on. To work around it, you can simply create multiple Event Grid subscriptions.
  2. Additionally, Logic Apps currently has no straightforward way of closing the connection once it has received a message; hence Service Bus will emit events only every 10 minutes, as it waits for the idle timeout of the connection.
  3. Right now, you will need to use an HTTP-triggered Azure Function. The Service Bus-triggered Azure Functions do not support Event Grid yet.

Known issue: Package deployment failing for some Tier 2, Tier 3, Tier 4 and Tier 5 sandbox environments deployed in Microsoft subscriptions


A recent change is causing package deployments to fail during the preparation phase when trying to download a package on the BI machine associated with sandbox environments deployed in Microsoft subscriptions. When this error is encountered, the environment state in LCS displays Preparation failed. This issue is currently impacting some, but not all, Tier 2, Tier 3, Tier 4, and Tier 5 sandbox environments. The error impacts deployments of all package types when applying updates in LCS.

If the error occurs, the following appears in the log files in the BI folder that can be downloaded from LCS:

[2018-02-19 21:55:07] Downloading servicingtools.zip with download link https://test.blob.core.windows.net/servicingpackages/AX/7/servicingtools/1.0.30/servicingtools.zip?sv=2015-04-05&sr=b&sig=W5b0G85Qp8Q7Y1sxJbhqkkBwWa1v%2Bx1%2B9aSd5k0WeQE%3D&se=2018-02-20T21%3A54%3A53Z&sp=r to Server XYZ
[2018-02-19 21:55:07] Failed to download servicingtools.zip to Server 'XYZ'
[2018-02-19 21:55:07] Sleeping for 30 seconds before retrying

We are actively investigating this issue and will update this blog post as soon as we have released a fix.


Finding which TLS version is in use for client connections


How can you tell what version of TLS is currently being used for client connections? It's a simple question we've been asked as we prepare to ship the new OLE DB driver.

Starting with SQL Server 2016 SP1, and SQL Server 2012 SP4, the Trace xEvent (Debug channel) exposes the TLS/SSL protocol that's used by the client. If a TLS/SSL negotiation is completed successfully, information such as the TLS/SSL protocol, cipher, hash, and peer address is returned. If the negotiation fails, only the IP address of the client is returned.

[Image: output of the Trace xEvent showing the TLS/SSL protocol, cipher, hash, and peer address]

You can also refer to KB 3191296 for information. Note this will also be available in a future SQL Server 2014 Service Pack.

For more information on configuring SQL Server to use TLS1.2, as well as a list of known issues, refer to KB 3135244.

Pedro Lopes (@sqlpto) – Senior Program Manager

Microsoft Intelligent Security Graph


Hello, this is 上口 from the Microsoft Technology Center.

I will now be writing posts for the Microsoft 365 security blog on a regular basis.

I look forward to working with you all.

In Japan, through collaboration among industry, government, and the public, the period from February 1 through March 18 is designated Cybersecurity Month, with various events being held (NISC).

Microsoft, of course, is also conducting awareness activities, holding Microsoft Security Forum 2018 as its security event.

Today I would like to revisit the Intelligent Security Graph, a topic I also discussed with many customers last year.


But before that, let me briefly describe the current state of cyberattacks.

Cyberattacks are causing damage across many fields these days. Attack techniques evolve in various ways every day, and it is said that attackers outpace defenders.

Roughly 200,000 new malware samples are created every day, and ransomware damage is said to be up 6,000% year over year; survey data shows that 50% of victims paid around 1 million yen, and 20% of those paid 4 million yen or more. Total damages are estimated at more than 1 billion yen.

 

Responding to zero-days

A zero-day attack, as the name implies, attacks a vulnerability before the day ("day one") on which a fix for the discovered vulnerability is made available. With the traditional approach, a fix is developed only after the vulnerability has been discovered, so delayed responses, or situations where the attacker is already inside by the time the fix is applied, were unavoidable.

At Microsoft, we invest 100 billion yen per year in security on a global scale. And by analyzing a wide variety of data and feeding the results back to customers so they can use our products safely, we are now able to respond quickly to zero-day attacks.

 

Microsoft CEO Satya Nadella

 

Everything is intelligence

Microsoft aggregates billions of signals obtained through its own products and services, including roughly 1 billion Windows devices worldwide and, each month, 450 billion authentications and 400 billion emails from its commercial and consumer cloud services, together with third-party information. Applying artificial intelligence (AI) and behavioral analytics to analyze the security threat landscape in real time, we built the Intelligent Security Graph. Microsoft builds on the Intelligent Security Graph to provide customers with secure cloud services.

In the next blog post, I would like to write about what the Intelligent Security Graph can reveal.


[References]

• Intelligent Security Graph

https://www.microsoft.com/ja-jp/security/intelligence

• Security Intelligence Report

https://www.microsoft.com/ja-jp/safety/report/sir.aspx

LogReader errors ‘Validating publisher’ after AlwaysOn failover.


Chris Skorlinski
Microsoft SQL Server Escalation Services

Problem of the day: Transactional Replication was configured on a database in an AlwaysOn Availability Group. When the group was failed over to a synchronized secondary node, the Transactional Log Reader stopped working.

Initial LogReader errors when the failover occurs. If configured correctly, the LogReader will retry and reconnect to the new Primary.

2018-02-21 00:08:31.165 Status: 16384, code: 20007, text: 'No replicated transactions are available.'.

2018-02-21 00:09:16.210 Status: 4, code: 22021, text: 'Batch committed.  Batch consisted of 1 commands, 1 xacts.  Last xact: 0x00000037000001e80001, '0x00000037000001e80004'.'.

2018-02-21 00:10:16.318 Status: 2, code: 20011, text: 'The process could not execute 'sp_replcmds' on 'SQLSERVER-0'.'.

2018-02-21 00:10:16.318 The process could not execute 'sp_replcmds' on 'SQLSERVER-0'.

2018-02-21 00:10:16.318 Status: 2, code: 10054, text: 'TCP Provider: An existing connection was forcibly closed by the remote host.'.

2018-02-21 00:10:16.318 Status: 2, code: 10054, text: 'Communication link failure'.


Additional errors are reported if the LogReader is not configured correctly to redirect to the new Primary. The errors below show the LogReader still trying to connect to sqlserver-0, which is now a read-only secondary.

2018-02-21 00:11:57.584 Status: 32768, code: 53044, text: 'Validating publisher'.

2018-02-21 00:11:57.615 Status: 0, code: 20015, text: 'The target database ('AdventureWorks2012') is in an availability group and is currently accessible for connections when the application intent is set to read only. For more information about application intent, see SQL Server Books Online.'.

2018-02-21 00:11:57.615 The target database ('AdventureWorks2012') is in an availability group and is currently accessible for connections when the application intent is set to read only. For more information about application intent, see SQL Server Books Online.

2018-02-21 00:11:57.615 Status: 0, code: 22037, text: 'The target database ('AdventureWorks2012') is in an availability group and is currently accessible for connections when the application intent is set to read only. For more information about application intent, see SQL Server Books Online.'.


I set up this demonstration using Microsoft Azure VMs preconfigured with SQL 2016 AlwaysOn. I added AdventureWorks2012 to the availability group, then created a standard Transactional Replication publication on one table, using primary sqlserver-0 as the Publisher.

Next, I performed a failover to secondary sqlserver-1, causing the LogReader to fail with the first block of connectivity errors. Had I configured it correctly using the link below, the LogReader would have retried, reconnected to the new primary, and continued to replicate. However, I skipped the sp_redirect_publisher steps on purpose, causing the LogReader failure. My objective was to see if I could work back through this blog posting while the LogReader was in a failed state, still trying to connect to the old primary (sqlserver-0 now a secondary, with sqlserver-1 now primary).

https://blogs.msdn.microsoft.com/alwaysonpro/2014/01/30/setting-up-replication-on-a-database-that-is-part-of-an-alwayson-availability-group/


On a restart, the LogReader fails with the error below, showing that sqlserver-1, the new Primary, was not yet configured as a publisher. This makes perfect sense: for this repro I had, on purpose, not yet added sqlserver-1 as a possible publisher.

Error messages:

The publisher 'SQLSERVER-1' with distributor 'distributor' is not known as a publisher at distributor 'distributor'. Run sp_adddistpublisher at distributor 'distributor' to enable the remote server to host the publishing database 'AdventureWorks2012'. (Source: MSSQL_REPL, Error number: MSSQL_REPL21891)

Get help:
http://help/MSSQL_REPL21891

Errors were logged when validating the redirected publisher. (Source: MSSQL_REPL, Error number: MSSQL_REPL22037)

Get help:
http://help/MSSQL_REPL22037


On the Distributor, under Distributor Properties > Publishers, I added sqlserver-1 as a Publisher.

[Image: Distributor Properties dialog showing sqlserver-1 added as a Publisher]


I verified that the Distributor now shows linked servers to both Publishers.

[Image: Object Explorer showing linked servers for both Publishers]

Following step 3) of the blog posting, I added my remote distributor on sqlserver-1. I had to specify the replication administrator password created when I first set up Replication. If I had forgotten it, I could change it in the Distributor Properties, but I'd then have to update the original publisher with the new password. Thankfully I knew the replication admin password; I should, I had just set it up 10 minutes ago.

3) Configure Remote distribution on possible publishers (with screen shots)
For the possible publishers and secondary replicas: SRV2 and SRV3, we'll have to configure the distribution as a remote distribution that we created on SRV1.
• Launch SQL Server Management Studio. Using Object Explorer, connect to SRV2 and right click the Replication tab and choose Configure Distribution. Choose 'Use the following server as the Distributor' and click Add. Select SRV4 as the distributor.

I did step 4, adding a linked server to the Subscriber. I tried using the Wizard and got an error, so I just ran the script from step 4 of the blog posting.

EXEC sys.sp_addlinkedserver @server = 'subscriberserver';

Finally, I ran step 5 of the blog against the Distributor, updating the MSredirected_publishers table with the AlwaysOn Listener name AWListener. Note: specify only the name of the Primary server where the publication was created, not the current Primary node name. For me this was sqlserver-0.

USE distribution;
GO

EXEC sys.sp_redirect_publisher
    @original_publisher = 'sqlserver-0',
    @publisher_db = 'AdventureWorks2012',
    @redirected_publisher = 'AWListener';


I verified distributor was updated correctly.

SELECT TOP (1000) [original_publisher]
       ,[publisher_db]
       ,[redirected_publisher]
   FROM [distribution].[dbo].[MSredirected_publishers]


original_publisher       publisher_db                 redirected_publisher

------------------------ ---------------------------- ----------------------------------------

sqlserver-0              AdventureWorks2012           AWListener

SUCCESS! I was now able to restart the LogReader while sqlserver-1 (the original secondary) was primary; no failover back to the original Primary sqlserver-0 was needed. The LogReader, running on my Distributor, connected to the new primary and started replicating pending transactions.

Office 365 in the Classroom


Our ongoing commitment to create more inclusive and collaborative learning environments continues today with the arrival of a number of powerful updates to Office 365 Education. Building on the momentum of OneNote Class Notebook, which has surpassed more than 15 million notebooks created since the beginning of this school year, and the launch of Microsoft Teams for Education, we have worked with students, teachers, and research institutes to ensure that Office 365 continues to deliver the best learning outcomes.

Today’s updates to Office 365 Education include enabling students to write a paper using only their voice, thanks to Dictation in Office, the latest feature to join Microsoft Learning Tools, as well as improved access to assignments and class collaboration with the Microsoft Teams iPhone and Android apps. What’s more, we are delivering on the number one most requested feature from teachers, page locking in OneNote Class Notebook—allowing teachers to provide students with read-only access—and we will further help teachers save time through assignment and grade integrations with leading Learning Management Systems (LMS) and Student Information Systems (SIS), including Capita SIMS in the U.K. Office 365 Education gives teachers and students the power to unlock limitless learning. And best of all, it’s free for teachers and students.

Inclusive Learning Tools for better student results

Classrooms are diverse. Seventy-three percent of classrooms have students with reading levels that span four or more grades, and up to 50 percent of instructional time can be lost managing the students’ varying needs. Learning Tools is proven to help, and we are humbled by the results. In a recent study, it was shown to increase reading speed and comprehension for students of all abilities, leading to test scores that were 10 percent higher than those of students who did not use Learning Tools.

We are incredibly excited to see strong adoption of Learning Tools with more than 7 million monthly active users across Word, OneNote, Outlook, Edge, and Office Lens. Based on feedback from teachers, students, and parents, we have been working on extending the capabilities of Learning Tools even further—and today we are excited to announce several updates.

  • Dictation in Office—This simple yet transformational tool will help students of all abilities to write freely by using only their voice. Starting in February, Dictation will be available in Word, Word Online, PowerPoint, Outlook Desktop, OneNote Windows 10, and OneNote Online—and in more than nine languages.

Animated screenshot displays Dictation in Office.

  • Read Aloud—Allows students to hear the contents of an email while each word is highlighted in sync. It will soon be available on Outlook Desktop in more than 30 text-to-speech languages.
  • Immersive Reader—To further support students of different backgrounds, Immersive Reader now supports an additional 10 new languages. It is also coming to even more platforms in 2018 and will soon be available on Word for Mac, iPhone, and Android, as well as Outlook Desktop and OneNote for iPhone, iPad, and Mac.

Screenshot displays Immersive Reader in Microsoft Word.

New features for teaching and learning in Microsoft Teams

Since delivering Microsoft Teams for Office 365 Education last year, we’ve seen educators around the globe boost collaboration and learning outcomes using Teams as their digital hub. In addition to our recent announcement, we’re releasing a number of features that will make setting up, collaborating, and managing a classroom in Teams easier than ever before.

  • Assignments support—Using the Teams app on their iOS and Android mobile phone or tablet, students can now access upcoming assignments, receive new assignment notifications, and turn in their work. Teachers can create new assignments, as well as review and make edits to existing assignments all while on the go. We also improved the search function on mobile to ensure both students and teachers can quickly find and navigate to individual assignments.

iPhone displays Assignments accessed in Microsoft Teams.

  • On Demand Translation—Students and teachers will soon be able to translate content in a chat or a team channel into the language their tenant is configured for. This powerful feature lets teachers and students converse comfortably in their chosen languages and helps remove language barriers.
  • Assignment Analytics—Now teachers can track assignment engagement in real time—at a glance—to see who’s viewed and turned in their work.
  • Join Codes—Saves teachers valuable time by letting them invite students to a class with a simple shareable code. This capability will also prove helpful for staff and PLC teams, ensuring an effortless start to collaborating with co-workers.
  • Reusing a Team as a Template—Teachers can reuse an existing team as a template when creating a new team and can customize what they want to copy over: channels, tabs, team settings, apps, and even users.
  • Decimal Grading—Teachers can provide grading feedback in their preferred way using Decimal Grading.

New OneNote features for teachers

OneNote is also receiving helpful new features, which we know are going to prove popular among teachers.

  • Capita SIMS—Updates to Class Notebook include assignment and grade integration with more than 35 of the most widely used LMS and SIS, including Capita SIMS in the U.K. These integrations are coming to OneNote for Windows 10, OneNote Online, and OneNote for iPad, and will reduce administrative burden and save teachers time.
  • Page Locking—To further simplify classroom workflows, we are delivering on the number-one request from teachers for Class Notebooks: page locking. Teachers can now lock pages as read-only after giving feedback to the student.
  • Interactive math calculators—In OneNote, we are also enabling Desmos interactive math calculators, a set of popular applications for STEM teachers.
  • New stickers—We also added four new fun sticker packs: Feathered Friends, Science, Circus Animals, and Arrows.

Screenshot displays a graded assignment in OneNote, upon which a sticker and the teacher's praise "great work!" appear.

Teams integration with PowerPoint and Microsoft Stream

Our improvements to class collaboration don’t stop here. Teams has also joined forces with PowerPoint and Microsoft Stream to make it easy for teachers and students to create and share interactive content in just a few steps.

A teacher can use PowerPoint to build immersive class content (that includes ink, animations, and audio/video narrations), publish it to their Stream channel as a video, and have it surface in their Teams class to distribute to their students. Furthermore, Stream will also add automatic captioning to the videos to make them accessible to all learners.

Animated screenshot displays a PowerPoint presentation published as a video to Microsoft Stream.

We believe that through inclusive and collaborative learning environments, Office 365 Education is built to support every type of learner to empower them to do their best work. To learn more about the exciting updates coming to Microsoft Education, check out the Windows blog.

An Azure Text Analytics API / Sentiment Analysis Example Using C#

Among the Cognitive Services that Azure offers, the Text Analytics API provides a distinctive capability: it can infer the sentiment contained in a piece of text. For example, reading the sentence “I had a wonderful experience! The rooms were wonderful and the staff was helpful.”, we intuitively understand that it is positive; if we ask the API to analyze the sentiment of that sentence, it answers with a value between 0 and 1 indicating how positive or negative the text is. For the example above, the result is 0.96, meaning very positive. (The API supports several languages, but Korean is not yet among them.)

Have you ever considered a scenario like this: what if we used the Sentiment Analysis API to measure how negatively some of the news articles we commonly encounter are written? Below is an example of how this could be implemented.

If you have an Azure subscription, go to the Azure Portal (http://portal.azure.com) and click New > AI + Cognitive Services to create a Text Analytics API resource. Once the resource is created, you can find its Keys under resource management; grab a key now, because it is required when calling the API. Most Cognitive Services work the same way: create the resource, obtain the key, and include that key in the HTTP header of each API call to receive a response.

The code follows the pattern below.

// Requires: using System; using System.Net.Http; using System.Net.Http.Headers;
//           using System.Text; using Newtonsoft.Json;
var uri = "https://eastasia.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";

// Escape the inner quotes so the request body is valid JSON. (A summary that
// itself contains quotes would also need JSON escaping.)
var json = "{\"documents\": [{\"id\": \"1\",\"text\": \"" + node._summary + "\"}]}";

var client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "1234567890abcdefghijklmnopqrstuv");  // 32-character key

HttpResponseMessage response;
byte[] byteData = Encoding.UTF8.GetBytes(json);

using (var content = new ByteArrayContent(byteData))
{
    content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
    using (response = await client.PostAsync(uri, content))
    {
        var result = await response.Content.ReadAsStringAsync();

        var body = JsonConvert.DeserializeObject<SentimentBatchResult>(result);
        if (body != null && body.Documents != null)
        {
            foreach (var document in body.Documents)
            {
                Console.WriteLine("Subject: {0}, Sentiment Score: {1:0.00}",
                    node._subject.ToString(), document.Score);
            }
        }
    }
}

A few notes. Ocp-Apim-Subscription-Key is the header entry that carries the key obtained from the Azure portal, as mentioned earlier. node._summary is the summary text of a news article, and the Sentiment API is used to judge whether it is positive or negative. The Sentiment API returns its result as JSON; for deserialization, the SentimentBatchResult type comes from the preview version of the Microsoft.Azure.CognitiveServices.Language NuGet package.
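If you would rather not take the NuGet dependency, a minimal hand-rolled stand-in for the deserialization type might look like the sketch below. These classes are an assumption, not the actual package types; they are shaped after the v2.0 sentiment response, which looks like {"documents":[{"id":"1","score":0.96}],"errors":[]}, and Json.NET matches the lowercase property names case-insensitively by default.

// Hypothetical stand-in for SentimentBatchResult from the
// Microsoft.Azure.CognitiveServices.Language package.
// Requires: using System.Collections.Generic;
public class SentimentDocument
{
    public string Id { get; set; }      // matches "id" in the response
    public double? Score { get; set; }  // matches "score"; 0 = negative, 1 = positive
}

public class SentimentError
{
    public string Id { get; set; }
    public string Message { get; set; }
}

public class SentimentResultSketch
{
    public List<SentimentDocument> Documents { get; set; }  // matches "documents"
    public List<SentimentError> Errors { get; set; }        // matches "errors"
}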

A news summary for testing can be obtained quite simply, as follows.

// Requires: using System.IO; using System.Net; using System.ServiceModel.Syndication;
//           using System.Text.RegularExpressions; using System.Xml;
using (WebClient client = new WebClient())
{
    string data = client.DownloadString(_url);

    // Parse the downloaded RSS 2.0 feed into a SyndicationFeed.
    Rss20FeedFormatter rss = new Rss20FeedFormatter();
    rss.ReadFrom(XmlReader.Create(new StringReader(data)));

    SyndicationFeed feed = rss.Feed;

    foreach (SyndicationItem item in feed.Items)
    {
        string subject = item.Title.Text;

        if (item.Summary != null)
        {
            // Strip HTML tags and decode HTML entities from the summary.
            string t = item.Summary.Text;
            t = Regex.Replace(t, @"<.+?>", String.Empty);
            t = WebUtility.HtmlDecode(t);

            _list.Add(new ListNode(subject, t));
        }
    }
}

For example, _url was set to https://news.google.com/news/rss/search/section/q/Korean%2BBusiness%2Banalysis/Korean%2BBusiness%2Banalysis?hl=en&gl=US&ned=u to fetch an RSS feed of English articles found for the keyword "Korean business analysis"; the extracted news summaries were then used as the text whose sentiment the Sentiment API would assess.

When you actually run the test, the analysis may not be as useful as hoped. Printing each news subject along with its score, as described above, shows the sentiment of each summary: most scores come out at 0.5, only about three articles register as positive, and none appear negative. Since news text mostly aims to convey facts, it tends to be emotionally dry, which may well explain the result.
