
Episode 8: IT IQ Series – Predictive analytics: the “cheat-sheet” for more effective teaching?


Summary: By detecting learning difficulties before it’s too late, predictive analytics can help teachers gain a far deeper understanding of their students, and of how to address their difficulties.

Can technology help teachers get the best out of their students? With Australia’s education system struggling to compete globally, many teachers face growing pressure to do more, with less, for an ever-growing number of students under their care. That’s especially true for low socioeconomic zones where only 60% of students finish school, and where teachers face the greatest shortage of resources. These teachers, however, have a new and promising tool at their disposal: data analytics.

“Educators already know that factors such as learning styles and absorption rates impact how effectively students end up performing. But it’s the lack of visibility toward complicated socio-economic or geographic factors that frustrates educators,” says Peter Manoukian, an Account Technology Strategist at Microsoft. “Right now, our approach to these issues is purely reactive: we can only intervene when performance starts to slide. But when you apply analytics to school and classroom data, you start to see the warning signs of academic decline before it happens. That gives teachers a valuable window of opportunity to make a change.”

Predictive analytics platforms have already been used to great effect by different industries to understand consumer behaviour and improve decision-making. With the right data, the same could apply to Australia’s classrooms.

“Schools already have a wealth of historical student data that can be immediately fed into a predictive analytics platform to create reports on where slippages and dropouts have occurred in the past, and more importantly where slippage may occur in the future,” explains Manoukian. “Educators want an intuitive, easy method to analyse this student data to confirm their daily observations of students, and to use that confirmation to justify taking action to help declining students.”

From Classroom to Community

Educators can only do so, however, if their analytics tools give them clear answers without requiring overly technical know-how. Some analytics tools on the market, like Microsoft Power BI, already offer analytics made easy, with no coding knowledge required, helping educators dive in and discover ways to help their students as quickly as possible. Teachers will benefit the most from platforms which present their data and insights through dashboards, charts, and other visual forms of reporting that they, their principals, and other superiors can quickly understand and act upon.

“Educators can point Power BI to existing Excel worksheets for the subject that they teach, type questions into the tool (using natural language), and find any trends or insights within the data,” says Manoukian. “They can also combine data at different levels, from the student to the entire school: that lets them get a bigger picture of where a particular student might fit within the broader trend, then build a lesson plan that will address their weaknesses or improve their strengths in class. But the more complex these data sets get, the more important visual communication becomes to getting everyone, from teachers to principals, on board with the needed changes.”

But what about long-term challenges such as declining grades or high attrition rates? Australian schools that employ predictive analytics would be in a better position to take proactive action, instead of resorting to a passive ‘wait-and-see’ method. That proactive position starts, says Manoukian, with assessing students’ broader circumstances.

“Through our work with schools around Australia, the Azure team has ended up handling a range of data sets including nutrition, socio-economic and even social data,” Manoukian explains. “When we feed these seemingly unrelated data sets into a platform like Azure Machine Learning, we can predict various potential problem sites with a reasonably high degree of accuracy. That, in turn, helps schools deploy their resources as well as work with other parties, like government agencies, to remove underlying roadblocks to students’ learning.”

That said, schools can’t treat analytics as a ‘fire and forget’ solution. Teachers and community leaders must assess the actual impact of their policies or processes, then use the results as feedback to further improve the analytics models behind their decision-making. In that regard, analytics and students’ results have one thing in common: you only get out what you put in.

“When working with analytics and machine learning, you’ll need to continuously pump in often a wide range of data to train the machine and allow it to make better predictions,” cautions Manoukian. “The earlier Australian schools start, the longer they’ll have to accumulate that data, and the more effective their efforts to improve students’ performance over time. Change may not come overnight, but the results for our most vulnerable and at-risk students will be well worth the investment.”

Watch Peter Manoukian speak further on the role of Education Analytics and how it can help Australian schools on our YouTube channel.

Get started on Microsoft Power BI, and learn how to start using the tool to help your class, on the Microsoft Virtual Academy. To get started on Azure and Azure Machine Learning, connect with our locally certified consultants today.

Our mission at Microsoft is to equip and empower educators to shape and assure the success of every student. Any teacher can join our effort with free Office 365 Education, find affordable Windows devices and connect with others on the Educator Community for free training and classroom resources. Follow us on Facebook and Twitter for our latest updates.


Announcing the deprecation of the WIT and Test Client OM on Jan 1, 2020


Since the first version of Team Foundation Server (TFS) in 2005, we have provided a set of SOAP APIs for programmatic interaction with Work Items and Tests. In recent years, REST has replaced SOAP as the preferred method for building integrations offering a simpler and more flexible programming model, support for multiple data formats, and superior performance and scalability. As our REST APIs have matured we've reached a point where we feel it's time to announce the deprecation of the Work Item and Test SOAP APIs. The plan is as follows:

  • TFS "2020" (shipping towards the end of the 2019 calendar year) will remove support for the WIT and Test SOAP APIs.
  • VSTS support will be removed on Jan 1, 2020.

If you are a consumer of these SOAP APIs, we recommend planning a migration to REST. The table below outlines a support matrix of what you can expect across both SOAP and REST.

TFS

              TFS 2018    TFS "2019"1    TFS "2020"2 and later
SOAP   WIT    Yes         Yes            No
       Test   Yes         Yes            No
REST   WIT    Yes         Yes            Yes
       Test   Partial     Yes            Yes

1 Code name for the TFS version expected at the end of 2018
2 Code name for the TFS version expected at the end of 2019

VSTS

              Before Jan 1, 2020    After Jan 1, 2020
SOAP   WIT    Yes                   No
       Test   Yes                   No
REST   WIT    Yes                   Yes
       Test   Yes                   Yes

Additional resources

If you need help, or want to provide feedback, please reach out to this support alias.

FAQ

Q: What happens to the Version Control SDK and its SOAP APIs?
A: We will keep supporting the Version Control SDK, but we will end-of-life the SDK at some point in the future. If you create a new application, we recommend you start with the REST APIs. If you have existing apps using the SOAP SDK, we recommend you migrate to the REST APIs.

Q: The SDK included an easy-to-use C# client library. Does REST API have an equivalent?
A: Yes, the REST APIs have a .NET client library that abstracts away the need to call the REST endpoints directly.
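As a rough illustration (a minimal sketch, not an official sample: the account URL, personal access token, and work item ID are placeholders, and package versions vary by TFS/VSTS release), reading a work item through the .NET client library looks like this:

// Minimal sketch using the Microsoft .NET client libraries for TFS/VSTS
// to read a work item over REST instead of the deprecated SOAP OM.
using System;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;

class WorkItemSample
{
    static void Main()
    {
        // Placeholder account URL and personal access token.
        var connection = new VssConnection(
            new Uri("https://myaccount.visualstudio.com"),
            new VssBasicCredential(string.Empty, "personal-access-token"));

        // The client library wraps the Work Item Tracking REST endpoints.
        var witClient = connection.GetClient<WorkItemTrackingHttpClient>();
        var workItem = witClient.GetWorkItemAsync(42).Result;

        Console.WriteLine($"{workItem.Id}: {workItem.Fields["System.Title"]}");
    }
}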

Q: I'm using TFS2017 or an earlier version. Can I migrate to REST APIs?
A: No, older versions of TFS don't include the full set of Work Item and Test REST APIs.

Securing Azure Storage for the Software Architect


In this post, App Dev Manager Keith Anderson explores how Azure Key Vault can be used to secure access to Azure storage operations.


In the beginning...

As I began to learn about Microsoft’s cloud a couple of years ago, I began to realize that some services had been around since its inception and were fundamental to the way things worked, with other services built on top. Azure Storage is certainly one of those fundamental services that can be used as building blocks in your own applications. To any organization leveraging Azure Storage, it seems to me that, like with any tool or construction material, you can make mistakes if you aren’t careful, and those mistakes can potentially be dangerous to whatever you are creating or doing.

What is a software architect anyway?

I was once hired to be the first software architect in an organization where I had already been working as a developer. Prior to that point, the role did not formally exist in my organization, though many of the architectural concerns were handled in a decentralized way, by one or more individuals throughout my tenure there. After I accepted the role, I had to help define it. After thinking about it for a while and doing a bit of research, I settled on a definition that centered around making high level decisions about technologies and tools so that developers could focus on implementation without having to reinvent the wheel, such as deciding which language or RDBMS to use. It also meant controlling and standardizing the way those technologies and tools could be used. Taken to its logical conclusion, to me this meant owning most if not all of the cross-cutting concerns, including database access, identity and security (AuthN/AuthZ), and SOA-level event publication/subscription.

Storage as a cross-cutting concern

Azure Storage is a tool that fits into the definition of one of these cross-cutting concerns. You could leave it up to the developer how they use it, but in doing so you would leave your organization open to security flaws that could potentially be costly and dangerous. Case in point: If you have ever downloaded and looked at one of the storage code samples, you would see something like the following:

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the CloudBlobClient that is used to call the Blob Service for that storage account.
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();

// Or create the CloudTableClient, CloudQueueClient, or whichever client you need.
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();

Then elsewhere in the documentation, you will find a warning to never share your storage key, which is exactly what you are getting from CloudConfigurationManager.GetSetting("StorageConnectionString"). We tell you this over and over again. Never share your keys and secrets. So, will your developers know not to do this after downloading the samples or will they just do what they need to do to get it working?

What is the danger?

So, what’s the big deal anyway? Why shouldn’t you pass around your storage key to all of your applications and developers? The typical answer is that if it gets into the hands of someone you don’t trust, you’ll have to change it. Then you’ll have to update every single application configured to use it. Sometimes this means deploying the entire application, or at least updating the configuration. The other danger is that with the access key, you can do anything, including delete all of your data. There isn’t any concept of Role Based Access Control and granting permissions for only what you need. So, how do you protect your data and give developers and applications access to do only what they need to do and only when they need to do it?

Can Shared Access Signatures help?

One answer may be to create a SAS key, make sure it has just enough rights to perform the action required, and give it an expiration so that it can’t be used indefinitely. SAS Keys can be used to instantiate Cloud{type}Clients just like a CloudStorageAccount object.

// Service side: create a shared access policy that expires in 30 minutes.
// No start time is specified, which means that the token is valid immediately.
// The policy grants add, query, update, and delete permissions.
SharedAccessTablePolicy policy = new SharedAccessTablePolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(
        SasProducer.AccessPolicyDurationInMinutes),
    Permissions = SharedAccessTablePermissions.Add
        | SharedAccessTablePermissions.Query
        | SharedAccessTablePermissions.Update
        | SharedAccessTablePermissions.Delete
};

// Generate the SAS token. No access policy identifier is used, which
// makes it a non-revocable token, and the table SAS is limited to the
// requesting customer's partition.
string sasToken = cloudTable.GetSharedAccessSignature(
    policy /* access policy */,
    null /* access policy identifier */,
    customerId /* start partition key */,
    null /* start row key */,
    customerId /* end partition key */,
    null /* end row key */);

// Client side: request the SAS token from the service above...
string sasToken = this.addressBookService.RequestSasToken(this.customerId);

// ...then create credentials using the token.
StorageCredentials credentials = new StorageCredentialsSharedAccessSignature(sasToken);
CloudTableClient tableClient = new CloudTableClient(tableEndpoint, credentials);

Wait a minute though. That cloudTable object we’re using to call GetSharedAccessSignature() came from somewhere. In order to instantiate a CloudTable object, you need the storage access key, so it seems we’re back to square one with all of our applications needing to know our storage access key.

Isolation and control behind a service

Well, not quite. What if we cut the code in half and isolate the part that creates the SAS token behind a service? That way, we share our storage access key with only one application, and that service is responsible for producing and distributing SAS tokens to all of our other clients. Access to storage becomes a cross-cutting service, controlled by the architect and consumed as a building block by the rest of the organization in an enterprise scenario, or by products in a software-as-a-service scenario.
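A minimal sketch of what the service side of that split might look like, reusing the table SAS pattern from the snippet above (the class name, table name, and method signature here are hypothetical, not from the original post):

// Hypothetical SAS-producing service: the only component that ever sees the
// storage account key. Clients call RequestSasToken and receive a short-lived,
// narrowly scoped table SAS instead of the key itself.
public class SasProducerService
{
    private readonly CloudTable addressBookTable;

    public SasProducerService()
    {
        // The connection string (and therefore the account key) lives only here.
        var account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("StorageConnectionString"));
        this.addressBookTable = account.CreateCloudTableClient()
            .GetTableReference("AddressBook");
    }

    public string RequestSasToken(string customerId)
    {
        // Scope the token to the caller's partition and a 30-minute lifetime.
        var policy = new SharedAccessTablePolicy
        {
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30),
            Permissions = SharedAccessTablePermissions.Add
                | SharedAccessTablePermissions.Query
                | SharedAccessTablePermissions.Update
                | SharedAccessTablePermissions.Delete
        };

        return this.addressBookTable.GetSharedAccessSignature(
            policy,
            null /* access policy identifier */,
            customerId /* start partition key */,
            null /* start row key */,
            customerId /* end partition key */,
            null /* end row key */);
    }
}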

That is exactly the best practice written about in detail by the storage account team back in 2012 in this excellent blog post. Follow this blog to create your very own storage service.

https://blogs.msdn.microsoft.com/windowsazurestorage/2012/06/12/introducing-table-sas-shared-access-signature-queue-sas-and-update-to-blob-sas/

Once implemented, if your key is ever compromised or rotated according to your maintenance schedule, you only need to change and redeploy in one place. You also limit the attack surface for your key ever being compromised in the first place.

Can Key Vault help?

Yes! The storage account keys feature of Key Vault is now in public preview. This service can do many of the things discussed above and a lot more.

https://docs.microsoft.com/en-us/azure/key-vault/key-vault-ovw-storage-keys

// Create a KeyVaultClient with vault credentials.
var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(securityToken));

// Get a SAS token for our storage account from Key Vault.
var sasToken = await kv.GetSecretAsync("SecretUri");

// Create new storage credentials using the SAS token.
var accountSasCredential = new StorageCredentials(sasToken.Value);

// Use the storage credentials and the Blob storage endpoint to create a new Blob service client.
var accountWithSas = new CloudStorageAccount(accountSasCredential, new Uri("https://myaccount.blob.core.windows.net/"), null, null, null);
var blobClientWithSas = accountWithSas.CreateCloudBlobClient();

// Use the blobClientWithSas
...

// If your SAS token is about to expire, get the SAS token again from Key Vault and update the credentials.
sasToken = await kv.GetSecretAsync("SecretUri");
accountSasCredential.UpdateSASToken(sasToken.Value);

In this scenario, Key Vault becomes your protective service, brokering access to your storage services and managing your storage account keys as well, never exposing them to clients and rotating them regularly as a matter of best practice.

The actual secret stored in Key Vault is an account SAS URI that can be used to generate the various storage client objects. As such, it cannot be used with SAS policies at this time. Stored Access Policies are defined on resource containers, rather than at the account level, and signatures based on a policy can be revoked when the policy is revoked. You cannot do that with ad-hoc account SAS at this time. If you want the added security of being able to revoke access once given and before expiration, you should create your own storage access service.
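For contrast, here is a rough sketch of a revocable, policy-based SAS at the container level, the kind of pattern you would build into your own storage access service (the container and policy names are made up, and cloudBlobClient is the blob client created earlier):

// Sketch: a container-level stored access policy. SAS tokens issued against the
// policy can be revoked later by removing or changing the policy on the container.
var container = cloudBlobClient.GetContainerReference("customer-files");

var containerPermissions = container.GetPermissions();
containerPermissions.SharedAccessPolicies.Add("read-only-30d", new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddDays(30),
    Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.List
});
container.SetPermissions(containerPermissions);

// The SAS references the policy by name; revoking the policy revokes every
// token that was issued against it.
string policySas = container.GetSharedAccessSignature(null, "read-only-30d");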

Wrapup

Like most Azure services, it is easy to get up and running, but understanding how to operationalize it in a production environment is more complex. You have to take into account the entire landscape of scale and security, and the storage service is no different. Azure Storage is an extremely useful tool and building block for a myriad of uses, and getting security right for it is something any architect will want to devote some thought and energy towards. The key takeaway: limit exposure of your storage access key, and the best way to do that is to protect it behind a service.

 


Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

Finding the Correct Permissions for a Microsoft or Azure Active Directory Graph Call


Introduction

This post explains how to find out which permissions are needed for a specific Graph API call.

If you want granular control over each AAD application registration, granting only the exact permissions required to do what you need helps secure your environment against users abusing permissions that were granted in excess.

Whenever someone wants to use the Microsoft or AAD Graph API, they have to grant the correct permissions on the AAD application registration in order to be able to make the call.

 

For more information on the differences between the two Graph APIs, please look here: https://blogs.msdn.microsoft.com/aadgraphteam/2016/07/08/microsoft-graph-or-azure-ad-graph/

 

Note that just because you have been given the ability to make a call doesn’t mean that the call will be authorized and go through properly. For example, with Office 365 calendars: if you are logged in as user1, retrieve a delegated access token, and try to access the calendar of user2, but user1 doesn’t have access to user2’s calendar, you won’t be able to access it. Granting permissions to access calendars doesn’t mean your user can access a calendar that he or she doesn’t have access to in the first place.

 

Finding Which Permissions We Need for a Microsoft Graph Call

Finding the permissions for the Microsoft Graph API is easier because the required permissions are documented directly on each Microsoft Graph API call’s reference page.

 

To determine which permissions you need, check the permissions listed at the top of the reference page for the operation in the Microsoft Graph API documentation.

For example:

I would like to list some users from Azure using the Microsoft Graph API with a client credentials grant type. To do this, we can check the reference guide for the Microsoft Graph API here: https://developer.microsoft.com/en-us/graph/docs/concepts/v1-overview

In the sidebar we can see the reference guide for the Microsoft Graph v1.0 endpoint. Since we would like to list users, I look in the users section to see if there is a list users operation.

[screenshot]

There is indeed a list users call, so I click on that. At the top of the list users API documentation, we can see the permissions that are needed:

[screenshot]

Depending on how you plan on getting your access token, you will pick either the application permission or the delegated permission. The differences between application and delegated permissions are described here: https://docs.microsoft.com/en-us/azure/active-directory/application-dev-delegated-and-app-perms.

 

With client credentials we will need to use the application permissions; the delegated permissions are used for the authorization code grant type, or any other flow that involves a signed-in user.

For more information on permissions you can go to the permissions page for Graph API here : https://developer.microsoft.com/en-us/graph/docs/concepts/permissions_reference

In this blog post we will use the list users functionality, so we will want the User.Read.All permission (the least privileged permission for this call), which lets the AAD application registration list users.

 

Now, the AAD App Registrations blade defaults to the applications listed under “My apps”, which means applications you are considered an owner of. Ownership is not related to getting an access token for the Microsoft Graph API, so we can disregard not being the owner of the application. Press the box that says “View all applications” to get to your app, shown in the picture below.

You can also change the box that says “My apps” to “All apps” to show all the applications in your tenant.

[screenshot]

[screenshot]

From here we click on the application that you created, and then go to the Required permissions blade shown in the picture below:

[screenshot]

After pressing the “Add” button, we will add a permission for the Microsoft Graph, as shown below.

[screenshot]

[screenshot]

Notice that the permissions are listed by their descriptions, not by the exact claim. Since we want the User.Read.All permission, we need to determine which of these descriptions corresponds to the User.Read.All claim. We can find this in the permissions reference; for users it is here: https://developer.microsoft.com/en-us/graph/docs/concepts/permissions_reference#user-permissions

[screenshot]

Notice that this is under the delegated permissions; if we scroll down a bit more we will find the application permissions section.

[screenshot]

We will want to find the corresponding display string in our permissions list, “Read all users’ full profiles”, check its box, and press the Done button.

[screenshot]

After adding the User.Read.All permission for the Graph API, we need to press the Grant Permissions button, because this permission requires admin consent. You can tell that it requires admin consent because it says Yes under the admin consent column.

 

The picture below shows the Grant Permissions button next to the Add button in the portal. A tenant global admin is required to grant admin consent.

[screenshot]

 

We have now gone through the process needed to make a Microsoft Graph call, and should be able to authenticate against the AAD application registration and call the list users functionality. Remember that whether you pick the application permission or the delegated permission determines whether you get an access token with the correct scopes for your grant type flow.
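For reference, once an application token carrying User.Read.All has been acquired (for example through the client credentials flow), the list users call itself is a plain HTTP GET. A minimal sketch with HttpClient, assuming accessToken already holds a valid token:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

static class GraphListUsersSample
{
    // accessToken is assumed to be an app-only token with the User.Read.All role.
    public static async Task ListUsersAsync(string accessToken)
    {
        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // GET /v1.0/users returns a JSON payload with a "value" array of user objects.
            var response = await http.GetAsync("https://graph.microsoft.com/v1.0/users");
            response.EnsureSuccessStatusCode();
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}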

 

Finding Which Permissions We Need for an AAD Graph API Call

Finding the exact permission you need for an AAD Graph API call is a bit trickier, because there isn’t a direct mapping between permissions and AAD Graph API calls. In other words, you can’t look at a call’s documentation and read exactly which permission it needs.

 

You will have to look through the permissions/scopes for the AAD Graph API, work out what the call you would like to use is actually doing, and compare that to the permissions described on the permission scopes page: https://msdn.microsoft.com/en-us/library/azure/ad/graph/howto/azure-ad-graph-api-permission-scopes

 

For example, let’s say we would like to get a user’s manager, as described in this AAD Graph API call:

https://msdn.microsoft.com/en-us/library/azure/ad/graph/api/signed-in-user-operations#GetMyManagerObject

 

First we have to read the call’s documentation and see exactly what it does. To get a manager object, as described in “Operations on Navigation Properties”:

 

Relationships between a user and other objects in the directory such as the user's manager, direct group memberships, and direct reports are exposed through navigation properties. When you use the me alias, you can read and, in some cases, modify these relationships by targeting these navigation properties in your requests.

 

So we are trying to access the “navigation properties” of a user object: the manager endpoint retrieves the signed-in user’s manager object from the manager navigation property. This means we want the permission that allows us to read navigation properties of users. Going back to the permission scopes page, we can look for the permission we need in order to read navigation properties. Luckily this scenario is listed under “Permission Scope Scenarios”.

 

But if your call doesn’t fit under the permission scope scenarios, you can look at the permission scope details instead.

The permission scope details define all the permissions for Azure Active Directory. Notice that User.Read’s description says it cannot read navigation properties. Since the manager object is a navigation property, we have to look at the User.Read.All permission, which is a superset of User.ReadBasic.All. The extra capability User.Read.All has, as stated in its description, is that it is allowed to read navigation properties, which is where the manager object lives.

 

Thus, after granting an AAD application registration the User.Read.All permission, we will be able to use the GetMyManagerObject AAD Graph API call to get the manager object.
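As a rough sketch (assuming a delegated access token with User.Read.All, since the me alias only works for a signed-in user), the call itself looks like this:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

static class AadGraphManagerSample
{
    // delegatedToken is assumed to be a delegated token carrying User.Read.All.
    public static async Task GetMyManagerAsync(string delegatedToken)
    {
        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", delegatedToken);

            // Azure AD Graph signed-in user operation: GetMyManagerObject.
            var response = await http.GetAsync(
                "https://graph.windows.net/me/manager?api-version=1.6");
            response.EnsureSuccessStatusCode();
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}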

 

Essentially we will be going through the scenarios and then the permission scope details to determine the permissions we will need in order to call the AAD Graph API.

 

Conclusion

We have now gone through an example process for finding the permissions for both the Microsoft Graph API and the Azure Active Directory Graph API. There are some subtle differences between the two Graph APIs, and each requires a different process to determine which permissions must be granted in order to make a Graph API call.

 

If you experience any more issues, please file a support ticket and one of our support engineers will reach out to help you.

Using Postman to Call the Microsoft Graph API Using Client Credentials


Introduction

This blog post helps users stand up an Azure Active Directory application registration, explains which permissions need to be added to the AAD application registration, shows how to make REST calls to get an access token, and then calls the Microsoft Graph API to perform whatever operation you would like.

 

Please note that not all permissions are managed within Azure. For example, if you are making a Microsoft Graph call to Outlook and would like access to a calendar using a delegated permission, and the user making the call doesn’t have access to that calendar, the call will still fail because the user does not have access to the calendar.

 

Also please note that there are two different versions of the Graph API: the Azure Active Directory Graph API and the Microsoft Graph API.

In addition, note that the Microsoft Graph API currently has two endpoints, the beta endpoint and the v1.0 endpoint. Do not confuse these with the Azure v1 and v2 endpoints; those are not related to the Microsoft Graph’s v1.0 and beta endpoints.

 

Setting Up the AAD Application

The first step to getting access to the Microsoft Graph REST API is to setup an AAD Application Registration.

First we create the AAD application registration in the portal, configuring it to work with Postman so that we can make the call to the Microsoft Graph API. Go to the Azure Active Directory blade, then App Registrations, and create a new application registration.

[screenshot]

From there we create a web app with any name. Here I have set the name as “web app” and set the sign-on URL to the callback for Postman: https://www.getpostman.com/oauth2/callback. It doesn’t matter much what the callback URL is; we are using that one for consistency’s sake. Just make sure that the callback you put down for your AAD application registration is the same one you use in your Postman call. This is unimportant for the client credentials grant type but matters for other grant type flows.

[screenshot]

You will have to click out of the sign-on URL field to make the portal validate whether it’s correct.

Now that we have created our web app, we will want to create a secret. Please keep track of the secret, as you won’t be able to see it again. You will have to press Save in order for the secret to be generated.

 

[screenshot]

With this information in hand, we will be able to move forward and connect to this AAD registration. But without the correct permissions we won’t be able to get an access token to make calls to the Microsoft Graph API.

 

Finding Which Permissions We Need for a Microsoft Graph Call

Assuming we would like granular control over what the AAD application registration does and does not have access to, we want to make sure the registration only has the permissions it needs to make the Microsoft Graph API calls we intend to make.

There has been a separate blog post on finding the correct permissions for your graph API call listed below :

https://blogs.msdn.microsoft.com/aaddevsup/2018/05/21/finding-the-correct-permissions-for-a-microsoft-or-azure-active-directory-graph-call/

 

For this client credentials flow, we will want to set the required permission for Read all users' full profiles under Application Permissions. If you check the delegated permission you won't get the correct permissions because the client credentials flow only gets the application permissions. This permission is shown below.

 

[screenshot]

 

 

Accessing the AAD App Registration and Calling the Microsoft Graph API

 

Now let’s get a secret for our AAD application registration by going to the Keys blade and adding a secret. Please make sure to keep track of this key, as you will not be able to retrieve it again once you leave the blade.

 

[screenshot]

 

Be sure to press the save button in order to see the value after putting in the description for your key.

 

Now our AAD Application registration is ready to go and we can utilize postman to get an access token using the AAD Application registration to use the list users functionality in the Microsoft Graph API.

 

Please keep track of the client ID as well, which is the application ID of your app registration. This is shown below.

[screenshot]

 

We also need the token and authorize endpoints, which can be found next to the New application registration button in the App Registrations blade, shown in the picture below.

 

[screenshot]

 

Now we can go to Postman to get an access token. Below is a screenshot of how Postman should be set up with the variables we gathered in the steps above.

 

We will use the token endpoint as the URL to POST to, and then add the client ID, client secret, resource, and grant type as x-www-form-urlencoded fields in the body.

 

The only item not obtained in the steps above is the resource, which differs depending on what you’re trying to access. Here we are trying to access the Microsoft Graph, whose resource URL is https://graph.microsoft.com. If you are targeting something other than the Microsoft Graph, you will need to look through the documentation for that resource to find its resource URL.
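If you prefer code to Postman, the same token request can be made with HttpClient. A minimal sketch, where the tenant ID, application (client) ID, and secret are placeholders you substitute from the steps above:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

static class ClientCredentialsTokenSample
{
    public static async Task<string> GetTokenResponseAsync()
    {
        using (var http = new HttpClient())
        {
            // Token endpoint copied from the Endpoints button in the App Registrations blade.
            var tokenEndpoint = "https://login.microsoftonline.com/{tenant-id}/oauth2/token";

            // Same x-www-form-urlencoded fields as in the Postman body.
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "client_credentials",
                ["client_id"] = "{application-id}",
                ["client_secret"] = "{key-from-the-keys-blade}",
                ["resource"] = "https://graph.microsoft.com"
            });

            var response = await http.PostAsync(tokenEndpoint, body);
            response.EnsureSuccessStatusCode();

            // The JSON response contains the access_token to send as a Bearer header.
            return await response.Content.ReadAsStringAsync();
        }
    }
}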

 

[screenshot]

 

We have now received an access token back from the token endpoint for our Azure Active Directory tenant. With this access token we can create a new request with a single authorization header and call the list users operation on the Microsoft Graph API.

 

[screenshot]

 

We need to add a header with the key “Authorization” and the value "Bearer <access token>".

 

[screenshot]

 

We have now retrieved the list of users from the tenant that the AAD application registration belongs to.

 

 

Conclusion

We went through the steps of creating an AAD application registration, finding which call we want to make, finding which permissions we need to make that call, and gathering the configuration needed to get an access token from the token endpoint. We were then able to get an access token from the token endpoint and make a call to the Microsoft Graph API with it to list the users in our tenant. I hope that you can use this flow to make other calls to the Microsoft Graph API. If you have any issues, feel free to open a support ticket and one of our support engineers will reach out to you as soon as possible; please be sure to have a Fiddler trace of the error that you are experiencing.

05/07: Errata added for [MS-SMB2]: Server Message Block (SMB) Protocol Versions 2 and 3

05/07: Errata added for [MS-TSWP]: Terminal Services Workspace Provisioning Protocol

05/07: Errata added for [MS-RDPBCGR]: Remote Desktop Protocol: Basic Connectivity and Graphics Remoting


05/07: Errata added for [MS-RDPEGFX]: Remote Desktop Protocol: Graphics Pipeline Extension

05/07: Errata added for [MS-RDPELE]: Remote Desktop Protocol: Licensing Extension

05/21: Errata added for [MS-ADTS]: Active Directory Technical Specification

What has been happening at Microsoft – May 14th – May 18th 2018


Microsoft News:

 Microsoft US Partner:

 Microsoft Accessibility:

 Microsoft Flow:

Microsoft Power BI:

 Microsoft Secure:

 Microsoft 365:

 Microsoft Azure:

 Microsoft Azure Government Cloud:

 Microsoft PowerApps:

Microsoft Internet Of Things:

 

 Microsoft Industry Blogs:

  

Upcoming Events:

 

 

 


Investigating issues with MSA login failure in multiple regions – 05/22 – Mitigated


Final Update: Tuesday, May 22nd 2018 04:59 UTC

We have confirmed that all systems are back to normal as of May 22nd 2018 03:40 UTC. Our logs show the incident started on May 22nd 2018 02:28 UTC. Our partner team has identified the preliminary cause as a recent deployment task that impacted instances of a backend service, causing excessive network traffic and preventing authentication requests from completing. The issue was mitigated after a manual failover of the affected backend service.

Impact: Customers may have experienced intermittent errors while attempting to login to their Visual Studio Team Services accounts through MSA (personal Microsoft accounts).

Sorry for any inconvenience this may have caused.

Sincerely,
Rakesh Reddy


Initial Update: Tuesday, May 22nd 2018 03:55 UTC

Users might experience issues while logging into their VSTS accounts. This impact is currently limited to MSA account login. We are actively working with our partner team on this issue.

  • Next Update: Before Tuesday, May 22nd 2018 04:55 UTC

Sincerely,
Vamsi

Experiencing Microsoft Account login issue in Azure Portal – 05/22 – Resolved

Final Update: Tuesday, 22 May 2018 04:04 UTC

We've confirmed that all login systems are back to normal with no customer impact. Our logs show the incident started on 05/22/2018, 02:35 UTC, and that during the roughly one hour it took to resolve the issue, about 10% of customers experienced trouble logging in.
  • Root Cause: The failure was due to Microsoft Account login failures.
  • Incident Timeline: 1 hour & X minutes - 05/22, 02:26 UTC through 5/22, 03:35 UTC

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Jeff Miller




    Update: Tuesday, 22 May 2018 03:39 UTC

Root cause has been isolated to a Microsoft Account authentication issue which was impacting logins to the service.  We are investigating and currently mitigating the issue.

  • Next Update: Before 05/22 05:00 UTC

-Jeff Miller



Demystifying Windows 10 in S Mode


Since the introduction of Windows 10 S last year – an effort to provide a Windows experience that delivers predictable performance and quality through Microsoft-verified apps via the Microsoft Store – there has been much confusion around what this actually means, despite great feedback from customers and partners. Customers love the security, faster boot time, better battery life and consistent performance over time, however it seemed the naming was a bit confusing for both customers and partners.

 

Based on that feedback, Microsoft are simplifying the experience for our customers. They expect the majority of customers to enjoy the benefits of Windows 10 in S mode. If a customer does want to switch out of S mode, they will be able to do so at no charge, regardless of edition. 

 

Microsoft hope this new approach will simplify and make it possible for more customers to start using Windows in S mode: a familiar, productive Windows experience that is streamlined for security and performance across all our editions.

Read the full announcement here. 

With this in mind, if you're considering Windows 10 in S mode for your educational institution then check out the useful information below to ensure you are fully informed about what this great Windows package, perfect for Primary Education, is really about. 

 


Microsoft Education: Windows 10 in S Mode Overview

Windows 10 Pro in S mode is an enhanced mode of Windows 10 Pro that’s streamlined for security and superior performance. Watch as Shindy Skaar walks us through the important features in Windows 10 in S mode, who it’s for, and why it’s perfect for schools.

 


Windows 10 Pro in S mode: Great for Education

Windows 10 Pro in S mode is an enhanced security mode of Windows 10 – streamlined for security and superior performance. Visit the dedicated Windows 10 in S mode page to find out all the additional information you will need to make the right decision when considering Windows 10 in S mode. 

https://www.microsoft.com/en-us/education/products/windows-10-s/default.aspx

 


Looking to buy classroom devices with Windows 10 in S mode?

Many schools fall into the trap of believing that Windows devices are outside of their budget for education, however this is not the case. With Microsoft you can get more than you imagined, for less than you thought, with classroom PCs built with Windows 10 in S mode, perfect for education. Better yet, you can now buy devices direct from Microsoft Store with great prices, once you exclude VAT for educational institutions, making it easier than ever for you to get the devices that are right for your students quickly and easily. Take a look at the Microsoft Store page to begin your school's journey.

https://www.microsoft.com/en-gb/store/b/classroompcs

Moving SQL Server to Azure SQL Database with Azure Database Migration Service


Microsoft Japan Data Platform Tech Sales Team

岩淵 健

Azure Database Migration Service (Azure DMS), previously available in preview, was released as generally available (GA) on May 7, 2018. With Azure DMS you can migrate on-premises SQL Server databases smoothly to Azure SQL Database. Compared with other migration tools it is particularly well suited to migrating large databases, so please give it a try.

In this post I walk through the steps to migrate a SQL Server 2008 R2 database to Azure SQL Database using Azure DMS. As many of you know, SQL Server 2008/2008 R2 reaches end of support in July 2019, so I hope this serves as a useful reference when you consider a migration.

The database migration flow with Azure DMS

When migrating a database with Azure DMS, before starting the migration you use the Data Migration Assistant in the flow below to assess whether there are any issues that would block the migration and to make the necessary fixes, and then you migrate the database.

Schema migration with the Data Migration Assistant

  1. Assess the source database
  2. Migrate the schema

Data migration with Azure DMS

  1. Create an Azure DMS instance
  2. Create a migration project in the Azure DMS instance
  3. Migrate the data

 

1. Assess the source database

Let's start by assessing the source database.
The Data Migration Assistant can be downloaded here.

  1. Start the Data Migration Assistant and create a new project from the "+" icon.
  2. Select [Assessment] as the project type.
  3. Select [SQL Server] as the source server type and [Azure SQL Database] as the target server type.
  4. Give the project any name and select [Create] to create it.
  5. On the options screen, select [Check database compatibility] and [Check feature parity] to check database compatibility and feature parity.
  6. Enter the connection information for the source and select the database to migrate.
  7. Close the dialog and select [Start Assessment] to begin the assessment.
     When the assessment completes, a report like the following is displayed.

 

This environment contained a trigger using WITH APPEND, which is no longer supported as of SQL Server 2012, and the assessment detected it correctly. Normally you would resolve compatibility issues like this before moving to the next step, but since this is just a test run, I will proceed as-is.

2. Migrate the schema

Once the assessment is complete, use the Data Migration Assistant again to migrate the target schema.

  1. Start the Data Migration Assistant and create a new project from the "+" icon.
  2. Select [Migration] as the project type.
  3. Select [SQL Server] as the source server type, [Azure SQL Database] as the target server type, and [Schema only] as the migration scope.
  4. Give the project any name and select [Create] to create it.
  5. Enter the connection information for the source and select the database to migrate.
  6. Enter the connection information for the target and select the target SQL Database.
     Here I migrate to an environment created in advance, but you can also create a new target database from "Create a new Azure SQL Database".
  7. Select [Next] to go to the screen for choosing the objects to migrate.
     By default all objects are selected, but you can exclude objects from the migration using the checkboxes.
  8. Select [Generate SQL script] and review the migration script.

If any problems are detected in the script, fix them here.
The trigger containing WITH APPEND that the assessment found was not flagged as an error at this step.
Looking at the generated script, the trigger is executed through sp_executesql, so the error is not detected until run time. Cases like this are why it is important to resolve the issues found during the up-front assessment.

  9. Select [Deploy schema] to deploy the schema.

Check [Deployment results] to confirm there are no problems.
If an error occurs, it is displayed as shown below, so handle it as needed.

3. Create an Azure DMS instance

From here we switch to a browser and work in the Azure portal.

  1. Sign in to the Azure portal and select [Subscriptions] from [All services].
  2. Select the subscription in which you will create the Azure DMS instance, and under [Resource providers] select [Register] to the right of Microsoft.DataMigration.
  3. In the Azure portal, search for Azure Database Migration Service from [+ Create a resource] and select [Create] on the Azure DMS screen.
  4. On the migration service creation screen, enter the required information such as the service name and subscription, and create the service.

 

4. Create a migration project in the Azure DMS instance

  1. From [All services], select Azure Database Migration Service, and then select the migration service you just created.
  2. Open the migration service and create a project from the [New migration project] screen.
     Select [SQL Server] as the source and [Azure SQL Database] as the target.
  3. Enter the source and target connection information and select the databases to migrate.
     At this step, Azure DMS must be able to connect to the source database. If the connection fails, check your network and firewall settings as well; this is where I got stuck.
  4. Review the information you entered and save it.

5. Migrate the data

Now we finally get to the actual migration.

  1. Select the project you just created, and from [+ New activity] select [Run migration].
  2. Enter the source and target connection information and map the target databases.
  3. Select the tables to migrate and click [Save].
  4. Specify the activity name for the project and set the options.
     This time I selected the following three options under [Database validation]:
     • Schema comparison
     • Data integrity
     • Query correctness
  5. Select [Save], and if the migration summary looks good, select [Run migration] to start the migration.
  6. When the migration completes, check its status in the report.
     This time the report looked like the following. The count of code objects is one lower because I migrated without fixing the trigger containing WITH APPEND.

That completes the database migration.
As you can see, everything from detecting issues to migrating the database can be done easily.
Please try Azure DMS to migrate your databases to Azure.

Going forward, the service is planned to support additional source databases, including Oracle, Netezza, MySQL, and PostgreSQL.
Azure SQL Database Managed Instance is also planned as a supported migration target, so you will be able to migrate a wide variety of data sources to Azure data services easily with this service.

For related information, the Data Migration Assistant is also covered in this earlier post, so please refer to it as well.

 

Dynamics 365 – Basics about Sales tax documentation


If you are interested in the basic setup of Sales tax - the attached document could be the perfect starting point in order to understand the required setup and automatic processes in Dynamics 365 for Finance and Operations.

BASICS-ABOUT-SALES-TAX-in-Microsoft-Dynamics-

Debug Dynamics 365 for Finance and Operations on-premises with Visual Studio remote debugger


In this article I’m going to explain how to use Visual Studio Remote Debugger to debug a Dynamics 365 for Finance and Operations AOS in an on-premises environment. Why would you want to do that? Well, if you have an issue occurring in an on-premises environment that you can't reproduce on your developer (also known as Tier1/onebox/dev box) environment, this allows you to attach Visual Studio from the developer environment to the on-premises AOS and debug X++ code.

There's another related article here on debugging an on-premises AOS without Visual Studio, which may be useful depending on your circumstances.

Overview

The basic gist of this process is:
1. Use a D365 developer environment which is on the domain (and of course the network) with the AOS machine
2. Copy the remote debugging tools from developer environment to the AOS
3. Run the remote debugger on the AOS
4. Open Visual Studio on the developer environment and attach to the remote debugger on the AOS
5. From this point debug as normal

First let’s talk about why I’m using a developer environment which is joined to the domain: the remote debugger has a couple of authentication options – you can either set it to allow debugging from anyone (basically no authentication), or to use Windows authentication. It’s a bit risky to use the no-authentication option: although the remote debugger wouldn’t be accessible from the internet, it still allows access to the machine from the network without any control over it. So we’ll use the Windows authentication option, which means we need to be on the domain.

There’s nothing special about adding a developer environment to the domain, join as you would any other machine - I won't go into that here.

Copy the remote debugger to the AOS

On the developer environment you'll find "Remote Debugger folder" on the Windows start menu:

Copy the x64 folder from there, and paste it onto the AOS you're going to debug. Note that if you have multiple AOS in your on-premises environment, turn off all but one of them - so that all requests will go to that one AOS that you're debugging. Within the folder double click msvsmon.exe:

The remote debugger will open, and look something like this, take note of the machine name and port, in my case it's SQLAOSF1AOS1:4020.

Configure the developer environment

Now move over to the developer environment, log on as an account which is an Administrator of both the developer machine and the AOS machine you want to debug. Open Visual Studio and go to Tools, Options, set the following options:

Dynamics 365, Debugging: Uncheck "Load symbols only for items in the solution"
Debugging, General: Uncheck "just my code"
Debugging, Symbols: add paths for all packages you want to debug, pointing to the location of the symbols files on the AOS you want to debug, because my account is an Administrator on the AOS box I can use the default C$ share to add those paths, like this:

Close the options form, then go to Debug, Attach to process... In the window that appears, set the qualifier to the machine and port we saw earlier in the remote debugger on the AOS machine; in my case it was SQLAOSF1AOS1:4020. Then at the bottom click "Show processes from all users" and select the AXService.exe process, which is the AOS.

You'll get a warning, click attach.

On the AOS machine, you'll see in the remote debugger that you've connected:

Now open some code and set a breakpoint, in my case I'm choosing FormCustTable.init(), then open the application in the browser and open the form to hit your breakpoint.

Switching between source code files

When you try to step into a different source file (for example, stepping from CustTable.init() down into TaxWithholdParameters_IN::find()), you need to open the code for TaxWithholdParameters_IN manually from the Application Explorer (AOT) before stepping into it. If you don't, you'll get a pop-up window asking where the source code file is. If that happens, just cancel the dialog, open the file from the AOT, and then double-click the current row in the call stack to force the debugger to recognize that you now have the source file.

Happy debugging!

LCS (May 2018 – release 2) release notes


The Microsoft Dynamics Lifecycle Services (LCS) team is happy to announce the immediate availability of the release notes for LCS (May 2018, release 2).

Performance and reliability improvements

In this release of LCS we have made performance improvements to the Cloud hosted environments page. We have also addressed feedback on improving the load times for the Update tiles and Issue search features.

Deleting Organization users is now a soft delete

As an Organization admin, when you delete a user from the Organization users page, it is now a soft delete. The line in the grid will be color-coded to show that the delete was a soft delete. The user details are saved in LCS to ensure that there is an audit trail of the actions taken by the user.

 

 
