
Getting hands on with Visual Studio for Mac, containers, and serverless code in the cloud


We are looking to improve your experience on the Visual Studio Blog. It would be very helpful if you could share your feedback via this short survey that should take less than 2 minutes. Thanks!

This week we released the second alpha of Visual Studio for Mac version 7.2, as well as some hands-on labs to try out some of the new features.

Visual Studio for Mac 7.2 Alpha

The alpha version of our next Visual Studio for Mac release includes new features such as:

  • Docker Containers – Join the microservices revolution by testing and deploying your ASP.NET Core web apps to Docker containers. Visual Studio for Mac’s Docker support lets you easily deploy to a container as well as debug projects across multiple containers. Check out the hands-on lab below to get started!
  • Xamarin Live Player – Get started building Xamarin mobile applications in minutes! Visit xamarin.com/live to learn how easy it is to try out mobile app development with your existing iOS or Android device and Visual Studio for Mac.
  • Azure Functions – Build and deploy serverless code in the cloud. Functions can be accessed by mobile or server apps, scheduled or triggered, and you only pay for the time they run. Follow the hands-on-lab below to write your first Azure Function.
  • IoT projects – Build, test, and deploy apps for Internet of Things devices. You can write IoT apps using C#, and deploy them to a Raspberry Pi, following our simple instructions.

To try out these features, download Visual Studio for Mac and switch to the alpha channel in the IDE.

More Hands-On Labs

Our latest hands-on labs for Visual Studio for Mac will help you get started with new features available in the 7.2 alpha. Visit the VS4Mac labs GitHub repo for past weeks’ projects using the Unity 3D game engine, Internet of Things devices, ASP.NET Core web sites, and Xamarin for mobile app development.

Today we’ve published two additional labs: one using Docker container support and one using Azure Functions projects.

Lab 5: Deploying ASP.NET Core to a Docker Container

Lab 3 demonstrated how to build, test, and debug an ASP.NET Core website on your Mac. This lab will show you how to run and debug an ASP.NET Core web site and web API in Docker containers, by completing these 4 tasks:

  1. Create a Docker-enabled ASP.NET Core web site
  2. Create a Docker-enabled ASP.NET Core web API
  3. Integrate two container apps
  4. Debug multi-container solutions

Follow the complete instructions to set up the two ASP.NET Core projects, make them work together, and debug them simultaneously.

Deploying ASP.NET Core to a Docker Container

Lab 6: Serverless computing with Azure Functions

“Serverless computing” is a new type of cloud feature where you can host a “function” without having to worry about setting up a server, or even an application, to run it in. Simply build and deploy your Azure Function, and it will be automatically hosted and scaled as required. You only pay for the time the function is running; it can respond to application requests, be triggered on a schedule or by events, and access many different Azure services.
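Azure Functions supports several languages besides C#. Purely as an illustrative sketch (the `Context` and `HttpRequest` interfaces below are simplified stand-ins of my own, not the real SDK types), an HTTP-triggered function handler has roughly this shape:

```typescript
// Simplified sketch of an HTTP-triggered function handler.
// "Context" and "HttpRequest" are illustrative stand-ins for the
// runtime-provided objects, not the actual Azure Functions types.
interface HttpRequest { query: { [key: string]: string | undefined }; }
interface Context { res?: { status: number; body: string }; done(): void; }

function run(context: Context, req: HttpRequest): void {
    const name = req.query.name || "world";
    // The platform hosts and scales this; you pay only for execution time.
    context.res = { status: 200, body: `Hello, ${name}!` };
    context.done();
}
```

The hands-on lab itself uses C# in Visual Studio for Mac; the shape is the same: the runtime hands your function a request, and you hand back a response.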

To build your first Azure Function and get started with serverless computing, follow these 5 steps:

  1. Create an Azure Functions project
  2. Create an Azure Storage account
  3. Create and Debug an Azure Function
  4. Work with function.json
  5. Work with Azure Tables

The hands-on-lab instructions will walk you through creating the Azure Functions project in Visual Studio for Mac, deploying it to Azure, and persisting data with Azure Tables. This feature is so new it is only available in the Alpha channel release of Visual Studio for Mac. You’ll need to install an extension for Azure Functions, which the instructions will help you with.

Get Started

Download Visual Studio for Mac today, and visit the VS4Mac labs repo on GitHub. Both of this week’s labs just scratch the surface of the capabilities being demonstrated: Docker support enables new testing and deployment options, and Azure Functions opens up a new, easier way to interact with powerful Azure services.

With the Visual Studio Community edition it is easy and free to get started. Check out the docs for more in-depth information on Visual Studio for Mac, and leave a comment below to suggest additional hands-on-labs you’d like to see.

Craig Dunn, Principal Program Manager
@conceptdev

Craig works on the Mobile Developer Tools documentation team, where he enjoys writing cross-platform code for iOS, Android, Mac, and Windows platforms with Visual Studio and Xamarin.


Issues with Visual Studio Team Services – 08/31 – Investigating


Update: Thursday, August 31st 2017 16:29 UTC

We are experiencing intermittent issues in VSTS in the South Central region. Users will see symptoms including slowness and failures (e.g. - HTTP 503) using the web experience and some builds not completing. The root cause appears to be related to a single web tier node that started after a recent deployment. We've taken a memory dump on this node and pulled it out of the network load balancer. We're now working to confirm if this resolved the issue and will follow-up soon when we know more.

  • Next Update: Before Thursday, August 31st 2017 17:45 UTC

Sincerely,
Tom


Initial Update: Thursday, August 31st 2017 15:47 UTC

A potentially customer impacting alert is being investigated. Triage is in progress and we will provide an update with more information.

  • Next Update: Before Thursday, August 31st 2017 17:00 UTC

Sincerely,
Niall

Announcing TypeScript 2.5

Today we're happy to bring you TypeScript 2.5! If you've read our RC announcement, we've got a few new items that we're proud to highlight!

If you're not familiar with TypeScript, it's a typed superset of JavaScript. More simply put, it's just JavaScript with optional static types. Static types can make it easier to maintain your code by catching bugs early on, making it easier to navigate your projects, giving accurate code completion, and providing handy fixes for when you do make mistakes. By making types optional, you can get the flexibility of plain JavaScript when you need it, all while TypeScript also gives you the option to tighten things up to bring you more type safety. You can learn more about TypeScript on our website.
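As a small illustration (my own example, not from the announcement), a single annotation is enough for the compiler to reject a call that would otherwise fail at runtime:

```typescript
// The ": string" annotation lets the compiler catch bad calls early.
function shout(message: string): string {
    return message.toUpperCase() + "!";
}

const greeting = shout("hello");   // OK: "HELLO!"
// shout(42);  // Compile-time error: number is not assignable to string
```

Remove the annotation and the call site compiles like plain JavaScript; that is the "optional" part of optional static types.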

To start using TypeScript, you can grab it through NuGet or use the following command with npm:

npm install -g typescript

Visual Studio 2015 users (who have Update 3) can install TypeScript 2.5 from here, and Visual Studio 2017 users using version 15.2 or later will be able to get TypeScript by simply installing it from here.

While TypeScript 2.5 will be available for other editors soon, in the meantime you can configure Visual Studio Code and Sublime Text to use a newer version. Other editors may have different approaches to getting TypeScript 2.5 running.

Let's look at what TypeScript 2.5 brings!

The Extract Function and Extract Method refactorings

Our team is always in search of ways to bring more powerful tools to the TypeScript and JavaScript world. That's why with TypeScript 2.5 we've invested a lot into implementing extract method and extract function: two new refactorings that make complex rewrites trivial.

If you're using Visual Studio Code, this refactoring will be available in the upcoming release (though you can try it now by using VS Code Insiders releases).

This feature is still fairly new, so we expect there will be room for improvement, but we're excited to hear your feedback so we can polish things up.
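As a sketch of what the refactoring does (the code and names here are hypothetical, not from the release), selecting the summation statements and applying extract function produces a named helper:

```typescript
// Before: the subtotal computation is inlined in the caller.
function totalBefore(prices: number[], taxRate: number): number {
    let sum = 0;
    for (const p of prices) sum += p;
    return sum * (1 + taxRate);
}

// After "extract function" on the summation statements:
function subtotal(prices: number[]): number {
    let sum = 0;
    for (const p of prices) sum += p;
    return sum;
}

function total(prices: number[], taxRate: number): number {
    return subtotal(prices) * (1 + taxRate);
}
```

Both versions behave identically; the refactoring changes only the structure, and the editor works out which values need to become parameters.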

New quick fixes

We've also added a few quick fixes for when TypeScript can guess a little bit at what you meant to write.

One new quick fix will get triggered when you try to use JSDoc-style types in TypeScript. If you're in the habit of writing types like these, you might be surprised to find out that they're not valid in TypeScript, but TypeScript is happy to push us in the right direction.

Quick fixes correcting JSDoc-style types to TypeScript-style types.

We've also added a quick fix for when you try to reference the type of a property off of another type incorrectly. For example, for the following code

interface Foo {
    bar: number;
}

We might want to declare a variable named xyz whose type is tied to the type of bar. The correct way to write this would be to use an indexed access type:

// Get the type of the property named 'bar' off of 'Foo'.
var xyz: Foo["bar"];

but we might accidentally write var xyz: Foo.bar. TypeScript can now suggest the correct form in many cases.

A quick fix that corrects `Foo.bar` to `Foo['bar']`.

JSDoc type assertion support in JavaScript files

In TypeScript 2.4, we introduced the ability to get type-checking in JavaScript files so that users can more easily migrate to TypeScript, and have an easier experience with certain more lightweight projects. Taking advantage of this is as simple as adding a // @ts-check at the top of your .js file, or turning on the checkJs flag in your tsconfig.json's compilerOptions.

One thing that it lacked was the ability to "cast" or "assert" the type of an expression. This is important for situations where you know a little more than the type-checker and need to tell it so. To come up with a trivial example, let's take the following JavaScript code:

// @ts-check

var foo = Math.random() < 0.5 ? "hello" : 100;

foo.toUpperCase();
//  ~~~~~~~~~~~
//  Error! Property 'toUpperCase' does not exist on type 'string | number'.

TypeScript correctly indicates that we might be calling a method that doesn't exist on numbers. If we wanted to get around this (accepting the risk of a runtime error), we could write a JSDoc type assertion:

// Works!
var bar = /** @type {string} */ (foo);
bar.toUpperCase();

The syntax is /** @type {YOUR_TYPE_HERE} */ (someParenthesizedExpression).

Keep in mind that if you've enabled JavaScript checking on a file, invalid type assertions will still get caught:

var clearlyNumber = /** @type {string} */ (100);
//                      ~~~~~~~~~~~~~~
// Error! Type 'number' cannot be converted to type 'string'.

Optional catch clauses

Thanks to work by Tingan Ho, TypeScript 2.5 brings a new ECMAScript-track feature that makes catch clause variables optional. Much of the time, you'll find yourself writing a try/catch but not really caring about the thrown error. For example:

let contents;
try {
    contents = fs.readFileSync(".config_file").toString('utf8');
}
catch (unusedError) {
    // File might not exist, just fall back to some defaults.
    contents = createDefaultContents();
}

Notice that unusedError is never referenced in the above example. Barring philosophical issues about whether it's appropriate to ignore the error, we can make our code a little cleaner by taking advantage of the fact that the catch variable is now optional.

let contents;
try {
    contents = fs.readFileSync(".config_file").toString('utf8');
}
catch {
    // File might not exist, just fall back to some defaults.
    contents = createDefaultContents();
}

Deduplicated and redirected packages

When importing using the Node module resolution strategy in TypeScript 2.5, the compiler will now check whether files originate from "identical" packages. If a file originates from a package with a package.json containing the same name and version fields as a previously encountered package, then TypeScript will redirect itself to the top-most package. This helps resolve problems where two packages might contain identical declarations of classes, but which contain private members that cause them to be structurally incompatible.

As a nice bonus, this can also reduce the memory and runtime footprint of the compiler and language service by avoiding loading .d.ts files from duplicate packages.

The --preserveSymlinks compiler flag

TypeScript 2.5 brings the preserveSymlinks flag, which parallels the behavior of the --preserve-symlinks flag in Node.js. This flag also exhibits the opposite behavior to Webpack's resolve.symlinks option (i.e. setting TypeScript's preserveSymlinks to true parallels setting Webpack's resolve.symlinks to false, and vice-versa).

In this mode, references to modules and packages (e.g. imports and /// <reference types="..." /> directives) are all resolved relative to the location of the symbolic link file, rather than relative to the path that the symbolic link resolves to. For a more concrete example, we'll defer to the documentation on the Node.js website.
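Like other compiler options, the flag can be set on the command line (tsc --preserveSymlinks) or in your tsconfig.json:

```json
{
  "compilerOptions": {
    "preserveSymlinks": true
  }
}
```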

Enjoy!

We hope our work in TypeScript 2.5 will make you happier and more productive. If it did, let us know on Twitter with the #iHeartTypeScript hashtag.

Our team appreciates any sort of feedback on how we can improve. Feel free to let us know about any issues you run into or helpful ideas you think might make TypeScript even better to use on our GitHub issue tracker.

As for what's next, we've caught most of what's new on this blog post, but you can always check out our what's new in TypeScript page on our wiki for some more details, and keep an eye on our Roadmap that highlights our current plans and is frequently updated.

Thanks for reading and happy hacking!

Amazon AWS and new NDepend pricing plans in August’s VSTS extension round-up


Since the creation of the Marketplace, we have seen strong demand for tools to work with Amazon Web Services. I am so thrilled that this month the search for those tools comes to an end. In addition to Amazon releasing their AWS Tools, our partners at NDepend have recently lowered the prices on their static analysis tools for .NET code. This is such a great month for our customers and I hope you'll give all of these extensions a look.

AWS Tools for Microsoft Visual Studio Team Services

See it in the Marketplace here: https://marketplace.visualstudio.com/items?itemName=AmazonWebServices.aws-vsts-tools

Earlier this month, our partners at Amazon published their VSTS tools for AWS. This was a really cool partnership between the teams at Amazon and Microsoft's community of ALM Rangers. You can read Amazon's blog announcement as well as check out the source on GitHub.

Screenshot showing just a few of the included build and release tasks provided by the AWS Tools extension

This extension includes a very expansive set of build and release tasks for interacting with varying AWS services as well as a new service endpoint called 'AWS' for easy authentication to their services. Amazon's blog post has a very detailed list of all the included tasks and I am including it here for your convenience:

  • AWS CloudFormation Create/Update Stack. This task enables you to create or update a stack in AWS CloudFormation by using a template file and an optional parameters file. The task switches automatically between updating an existing stack or creating a new stack, depending on whether the stack already exists. You don't need to select a "mode", which makes this task convenient to use in pipelines. In addition to choosing the template and parameters file, you can elect to use a change set to create or update the stack, with the added option to automatically execute the change set (if it validates successfully). Or you can use the Execute Change Set task to execute the validated change set at a later time.
  • AWS CloudFormation Delete Stack. This task deletes a stack identified by name or ID. You might use it to clean up development or test environment stacks after a new, fresh deployment in a tear-down-and-rebuild scenario.
  • AWS CloudFormation Execute Change Set. As stated earlier, the Create/Update Stack task gives you the option to perform changes using a change set and, if the set validates, to execute the changes immediately or by using this task at a later time. You provide the name of the change set and the associated stack and the task does the rest, waiting for the stack to reach create or update complete status.
  • AWS Elastic Beanstalk Deployment. With this task you can deploy traditional ASP.NET applications using WebDeploy archives or deploy ASP.NET Core applications.
  • AWS Lambda .NET Core Deployment. This task enables deployment of standalone functions or serverless applications to AWS Lambda. The task uses the same dotnet CLI extensions as the AWS Visual Studio Toolkit, so you have the full customization capabilities of the command line tool switches available within the task.
  • AWS Lambda Invoke Function. In addition to deploying to AWS Lambda, you use this task to trigger Lambda functions to run from within your pipeline. The results of the function can be emitted into a variable for subsequent tasks in your pipeline to consume.
  • AWS S3 Download. Using a combination of bucket name and optional key prefix, this task uses a set of one or more globbing patterns to enable the download of content from an Amazon S3 bucket into your pipeline's working folders. For example, you can use this to inject custom static content into a build.
  • AWS S3 Upload. Similarly to the S3 download task, this task takes a bucket name and set of globbing patterns to be run in a source folder to upload content from the pipeline's working folders to a bucket.
  • AWS Tools for Windows PowerShell Script. This task enables you to run scripts that use cmdlets from the Tools for Windows PowerShell (AWSPowerShell) module, optionally installing the module before the script runs.
  • AWS CLI. This task enables you to run individual AWS CLI commands. However, you must have already installed the AWS CLI into the build host.

For rich set-up instructions be sure to check out the Marketplace listing page and Amazon's announcement blog.

Code Quality NDepend for TFS 2017 and VSTS

See it in the Marketplace here: https://marketplace.visualstudio.com/items?itemName=ndepend.ndependextension

This solution from NDepend is something I have highlighted before in the blog, but this time is a bit different because they have all new pricing plans that make the extension available to even more customers!

With over 200 installs and a solid 5-star rating, NDepend is one of the best code quality solutions in our Marketplace, and the extension adds a bunch of new features to your VSTS experience:

  • NDepend Dashboard Hub shows a recap of the most relevant data, including technical debt estimates, code size, Quality Gates status, and rule and issue counts.
  • Quality Gates are code quality criteria that must be met before committing to source control and, eventually, before releasing.
  • Rules check your code against best practices, with 150 included in the box and the ability to create custom additions.
  • Technical Debt and Issues are generated by checking your code against industry best practices, including deep issue drill-down.
  • Trends are supported and visualized across builds, so you can see whether you're improving or adding more technical debt.
  • Code Metrics are recapped for each assembly, namespace, class, or method.
  • A Build Summary recap is shown on each build's summary page.

With the new pricing plans for NDepend, you can enable a code quality engineer on your team to use all of these tools starting at $19 USD/month.

For full pricing info, check out the pricing tab of their Marketplace listing here: https://marketplace.visualstudio.com/items?itemName=ndepend.ndependextension

Are you using (or building) an extension you think should be featured here?

I'll be on the lookout for extensions to feature in the future, so if you'd like to see yours (or someone else's) here, then let me know on Twitter!

@JoeB_in_NC

Sentiment Analysis of US Presidential Inaugural Addresses


In this blog post, I will take a look at how we can use the Azure Cognitive Services Text Analytics API to analyze speeches in terms of their sentiment. This is a continuation of my previous explorations of the Text Analytics API, in which I made an e-book sentiment analyzer; I encourage you to read that post to get more details on how to use the Text Analytics API. I will be analyzing all US Presidential inaugural addresses from George Washington up through Donald J. Trump. Keep reading; the results may surprise you.

I am not trying to make any political statements with this analysis. It is simply an example of using Cognitive Services to look at a collection of speeches. 

The source of the data used in this analysis can be found on the American Presidency Project website. I will illustrate with some code how one can extract the actual text of each inaugural address. The source code is available on GitHub, and the most important pieces will be reproduced in the blog post.

TL;DR - Just Give Me The Results

I know that some readers will just be here for the results and will not care much about the implementation, so I will spare them the agony of reading through the entire post. Here are the results:

Summary statistics:

Mean: 0.82
Median: 0.84
Standard deviation: 0.15

 

The 10 most positive (in terms of sentiment) inaugural addresses:

President Date Sentiment
1 Thomas Jefferson March 4, 1801 1.00
2 William Henry Harrison March 4, 1841 1.00
3 Theodore Roosevelt March 4, 1905 1.00
4 George Washington April 30, 1789 1.00
5 Franklin Pierce March 4, 1853 0.99
6 James Monroe March 4, 1817 0.98
7 Zachary Taylor March 5, 1849 0.98
8 Woodrow Wilson March 4, 1913 0.98
9 Ulysses S. Grant March 4, 1869 0.98
10 Grover Cleveland - I March 4, 1885 0.97

 

The 10 most negative (in terms of sentiment) inaugural addresses:

President Date Sentiment
1 George Washington March 4, 1793 0.42
2 James Madison March 4, 1813 0.44
3 Abraham Lincoln March 4, 1865 0.49
4 Lyndon B. Johnson January 20, 1965 0.49
5 John F. Kennedy January 20, 1961 0.53
6 Barack Obama January 20, 2009 0.57
7 Thomas Jefferson March 4, 1805 0.65
8 George W. Bush January 20, 2001 0.67
9 Andrew Jackson March 4, 1833 0.70
10 Franklin D. Roosevelt January 20, 1945 0.71

 

One observation is that Donald J. Trump's recent "American carnage" inaugural address is not on either top 10 list. It has a score of 0.72, which is a bit below average, but not by much. This is in agreement with other analyses of that address, which have found it to be relatively positive. My analysis shows it to be a bit below average for inaugural addresses, but in the broader context of speeches, 0.72 is still relatively positive. You can find it in the complete list of scores at the end of this blog. Barack Obama's first inaugural address and John F. Kennedy's inaugural address are both in the top 10 of most negative sentiment, which may surprise some. However, if you read those addresses, I think you may agree that while they may be great speeches (this blogger does not presume to have an opinion about that), they also talk about some serious problems and challenges, which of course influences the language. It just shows that a speech can be inspirational (or aspirational) without necessarily having a positive sentiment. Obama's second address was much more positive than his first.

Another observation is that there are more modern presidents on the negative top 10 list, which is also visible if we plot the sentiment as a function of year:

[Chart: inaugural address sentiment score by year]
It would appear that there is a slight downward trend in sentiment, which may just be a reflection of language changes over the years. We would need some more data from other types of speeches to dig into that. I will leave you to figure out when the addresses will start to be real downers if the current trend continues.

I will also leave the political analysis to others. The purpose of this is just to illustrate that one can get interesting trends and information with relatively little work using Cognitive Services.

Implementation - The Gory Details

So if you would like to do this yourself, it is actually relatively easy. I have written a utility C# library that you can use to analyze speeches or other text documents. The library is called "SpielInsights", since calling it "SpeechInsights" might suggest something that analyzes spoken words (as in audio). The library will give you the sentiment of each paragraph of text and also the key phrases in each paragraph. It will also provide you with summary information for the entire document. In this blog post, I am really just reporting the overall sentiment of the document. This overall sentiment is calculated as a weighted average (by the relative length of each paragraph in characters) of the sentiment of all paragraphs in the text. You can find the source code for this utility library here.
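The weighting just described can be sketched as follows (in TypeScript here for brevity; the actual SpielInsights library is C#, and the names below are illustrative):

```typescript
interface ParagraphScore {
    characters: number;  // paragraph length in characters
    sentiment: number;   // per-paragraph score from the Text Analytics API
}

// Overall sentiment: average of the paragraph scores, weighted by the
// relative length of each paragraph in characters.
function overallSentiment(paragraphs: ParagraphScore[]): number {
    const totalChars = paragraphs.reduce((sum, p) => sum + p.characters, 0);
    if (totalChars === 0) return 0;
    return paragraphs.reduce(
        (sum, p) => sum + p.sentiment * (p.characters / totalChars), 0);
}
```

So a long negative paragraph pulls the document score down more than a short one, which matches the intuition that longer passages carry more of the speech's overall tone.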

In order to use the library, you will need to load your text into a "Spiel" object:

 

    public class Spiel
    {

        public Spiel()
        {
            Paragraphs = new List<string>();
        }

        public string SourceURI { get; set; }

        public string Speaker { get; set; }

        public string Category { get; set; }

        public DateTime Date { get; set; }

        public List<string> Paragraphs { get; set; }
    }

This contains a bit of information about the speaker and the time when the speech was given and a list of text paragraphs. This object can be passed to the AnalyzeSpiel function:

       static public SpielAnalytics AnalyzeSpiel(Spiel spiel, string apiKey)
       {
          ...
       }

This function will make as many calls as needed (based on the length of the text) to the Text Analytics API and then return some analytics:

 

    public class SpielParagraphAnalytics
    {
        public SpielParagraphAnalytics()
        {
            KeyPhrases = new HashSet<string>();
            Sentiment = 0;
            Words = 0;
            Characters = 0;
        }

        public long Words { get; set; }

        public long Characters { get; set; }

        public double Sentiment { get; set; }

        public HashSet<string> KeyPhrases { get; set; }
    }

    public class SpielAnalytics
    {
        public SpielAnalytics()
        {
            SummaryAnalytics = new SpielParagraphAnalytics();
            ParaGraphAnalytics = new List<SpielParagraphAnalytics>();
        }

        public SpielParagraphAnalytics SummaryAnalytics { get; set; }
        public List<SpielParagraphAnalytics> ParaGraphAnalytics { get; set; }
    }

Now that we have the basic structures and functions laid out, we need the data. I will just spend a bit of time here to show how I did the retrieval and cleaning of the data. As mentioned, the inaugural addresses can be found on the American Presidency Project web site; however, there is no database of the raw text that I could find, so I have made a list of each of the speeches with links to their specific pages. You can find that list here. An excerpt from the list looks like this:

George Washington;http://www.presidency.ucsb.edu/ws/index.php?pid=25800;April 30, 1789
George Washington;http://www.presidency.ucsb.edu/ws/index.php?pid=25801;March 4, 1793
John Adams;http://www.presidency.ucsb.edu/ws/index.php?pid=25802;March 4, 1797
Thomas Jefferson;http://www.presidency.ucsb.edu/ws/index.php?pid=25803;March 4, 1801
Thomas Jefferson;http://www.presidency.ucsb.edu/ws/index.php?pid=25804;March 4, 1805
James Madison;http://www.presidency.ucsb.edu/ws/index.php?pid=25805;March 4, 1809
James Madison;http://www.presidency.ucsb.edu/ws/index.php?pid=25806;March 4, 1813
James Monroe;http://www.presidency.ucsb.edu/ws/index.php?pid=25807;March 4, 1817
James Monroe;http://www.presidency.ucsb.edu/ws/index.php?pid=25808;March 4, 1821

    ...

I have then written a small C# (.NET Core) routine that runs through this list, finds the part of each HTML document where the speech text is, and puts each paragraph into the "Spiel" structure before sending it to the analysis routine. You can find the source code of this routine here, and since it is pretty short, I have reproduced it below:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using HtmlAgilityPack;
using SpielInsights;
using System.Collections.Generic;
using System.Text;

namespace Inaugurals
{
    class Program
    {
        static void Main(string[] args)
        {
            string inputList = args[0];
            string apiKey = args[1];
            string outputFileName = args[2];

            var client = new HttpClient();

            //Output file
            System.IO.StreamWriter outputFile = new System.IO.StreamWriter(outputFileName);

            Task.Run(async () =>
            {
                using (StreamReader reader = new StreamReader(inputList))
                {
                    string line;
                    while ((line = reader.ReadLine()) != null)
                    {
                        Spiel spiel = new Spiel();

                        string[] components = line.Split(";");

                        spiel.Speaker = components[0];
                        spiel.SourceURI = components[1];
                        spiel.Date = System.Convert.ToDateTime(components[2]);

                        Console.WriteLine("Processing Inaugural: " + components[1]);
                        var response = await client.GetAsync(components[1]);
                        var content = await response.Content.ReadAsStringAsync();

                        HtmlDocument htmlDocument = new HtmlDocument();
                        htmlDocument.LoadHtml(content);

                        foreach (HtmlNode node in
                                 htmlDocument.DocumentNode.SelectNodes("//span[@class='displaytext']"))
                        {
                            HtmlDocument innerHtmlDocument = new HtmlDocument();
                            innerHtmlDocument.LoadHtml(node.InnerHtml);

                            foreach (HtmlNode pnode in innerHtmlDocument.DocumentNode.SelectNodes("//text()"))
                            {
                                string paragraphText = pnode.InnerText.Trim();
                                spiel.Paragraphs.Add(paragraphText);
                            }

                            SpielAnalytics analytics = SpielInsights.SpielInsights.AnalyzeSpiel(spiel, apiKey);


                            //Build semicolon separated output records
                            StringBuilder osb = new StringBuilder();
                            osb.Append(components[0] + ";"); //Speaker
                            osb.Append(components[2] + ";"); //Date
                            osb.Append(analytics.SummaryAnalytics.Sentiment);

                            outputFile.WriteLine(osb.ToString());
                            Console.WriteLine(osb.ToString());
                        }
                    }
                }
            }).GetAwaiter().GetResult();

            outputFile.Close();
        }
    }
}

This routine simply loops through all the lines in the list of inaugural addresses, uses an HttpClient to retrieve each one, then extracts the <span> in the HTML document that contains the actual speech text and loops through each section of it. The sections/paragraphs are added to the "Spiel" and then we retrieve the analytics.

As you can see, there is a lot more information in the analytics structures than I have presented here, but I will leave it to you to play with it and see what interesting things you find. I may find some time to post it online in a way that can be searched and browsed. If I do, I will write about it on my blog, so please subscribe.

That's it. Have fun analyzing speeches or playing with the data presented here. Please rate the blog and post any comments you may have.

Complete Results

For completeness, I am adding all the sentiment scores for all inaugural addresses in chronological order.

President Date Sentiment
George Washington April 30, 1789 1.00
George Washington March 4, 1793 0.42
John Adams March 4, 1797 0.95
Thomas Jefferson March 4, 1801 1.00
Thomas Jefferson March 4, 1805 0.65
James Madison March 4, 1809 0.85
James Madison March 4, 1813 0.44
James Monroe March 4, 1817 0.98
James Monroe March 4, 1821 0.92
John Quincy Adams March 4, 1825 0.94
Andrew Jackson March 4, 1829 0.93
Andrew Jackson March 4, 1833 0.70
Martin van Buren March 4, 1837 0.75
William Henry Harrison March 4, 1841 1.00
James K. Polk March 4, 1845 0.91
Zachary Taylor March 5, 1849 0.98
Franklin Pierce March 4, 1853 0.99
James Buchanan March 4, 1857 0.97
Abraham Lincoln March 4, 1861 0.78
Abraham Lincoln March 4, 1865 0.49
Ulysses S. Grant March 4, 1869 0.98
Ulysses S. Grant March 4, 1873 0.81
Rutherford B. Hayes March 5, 1877 0.92
James Garfield March 4, 1881 0.77
Grover Cleveland - I March 4, 1885 0.97
Benjamin Harrison March 4, 1889 0.90
Grover Cleveland - II March 4, 1893 0.83
William McKinley March 4, 1897 0.91
William McKinley March 4, 1901 0.87
Theodore Roosevelt March 4, 1905 1.00
William Howard Taft March 4, 1909 0.94
Woodrow Wilson March 4, 1913 0.98
Woodrow Wilson March 4, 1917 0.72
Warren G. Harding March 4, 1921 0.78
Calvin Coolidge March 4, 1925 0.92
Herbert Hoover March 4, 1929 0.78
Franklin D. Roosevelt March 4, 1933 0.86
Franklin D. Roosevelt January 20, 1937 0.77
Franklin D. Roosevelt January 20, 1941 0.74
Franklin D. Roosevelt January 20, 1945 0.71
Harry S. Truman January 20, 1949 0.81
Dwight D. Eisenhower January 20, 1953 0.84
Dwight D. Eisenhower January 21, 1957 0.86
John F. Kennedy January 20, 1961 0.53
Lyndon B. Johnson January 20, 1965 0.49
Richard Nixon January 20, 1969 0.74
Richard Nixon January 20, 1973 0.80
Jimmy Carter January 20, 1977 0.85
Ronald Reagan January 20, 1981 0.73
Ronald Reagan January 21, 1985 0.80
George Bush January 20, 1989 0.83
William J. Clinton January 20, 1993 0.76
William J. Clinton January 20, 1997 0.87
George W. Bush January 20, 2001 0.67
George W. Bush January 20, 2005 0.85
Barack Obama January 20, 2009 0.57
Barack Obama January 21, 2013 0.84
Donald J. Trump January 20, 2017 0.72

Performance Issues with Visual Studio Team Services – 08/31 – Investigating


Initial Update: Thursday, August 31st 2017 19:14 UTC

We are investigating performance issues affecting users of the service. Users in SCUS may experience degraded performance or slow page load times.

  • Next Update: Before Thursday, August 31st 2017 20:15 UTC

Sincerely,
Venkata Sainath Reddy

Recognizing LaTeX Input in UnicodeMath Input Mode


In offering a LaTeX math input mode, we’ve run into the problem that a user might type some LaTeX while the UnicodeMath input mode is active and get something unintended and confusing. This post reveals ways in which the build-up engine can recognize this situation and maybe cue the user to switch to the LaTeX input mode. Furthermore, some purely LaTeX constructs, like \frac{}{}, could be handled correctly in UnicodeMath input mode. It seems more user friendly to do so than to build up to an undesired result. UnicodeMath and LaTeX are compared a bit here.

Symbol Control Words

Control words for symbols work in either input mode; for example, \alpha inserts α in both modes. So, there’s no need to change the input mode for symbol control words. Similarly, Unicode symbols like ∬ work in both input modes. The build-up engine supports Unicode LaTeX since the Office math facility was based on Unicode from the start. Note that UnicodeMath is defined in terms of Unicode symbols, not ASCII-letter control words, but the latter are supported by the input engine for ease of entry on standard keyboards. On-screen keyboards may offer more direct ways of entering Unicode symbols.

TeX Math Zone Delimiters

If a math zone begins with a $, the input must be TeX or LaTeX, since $ has no special significance in UnicodeMath and Office apps use the math-zone character format effect to define math zones. But the user might not start with a $, so it’s worth handling other ways that distinguish the formats. The LaTeX math-zone start delimiters \[ and \( have useful meanings in UnicodeMath, namely to treat the [ and ( literally instead of treating them as autosizing build-up delimiters.

Structure and Environment Control Words

Some structure control words such as \frac and \binom are only defined in LaTeX and others like \matrix and \pmatrix are defined in both modes. The user pain enters when typing something like \frac{a}{b} in UnicodeMath mode. The {…} get built up as curly braced expressions and the \frac remains unchanged. No fraction results and the user may wonder what went wrong.

When the user types LaTeX-only structure control words like \frac or \binom in UnicodeMath input mode, it’s clear that LaTeX is intended and the user can be asked whether the input mode should switch to LaTeX. Similarly, structure control words valid in both input modes become unambiguous when the user types the argument start delimiter. For LaTeX the start delimiter is {, while for UnicodeMath it’s (. So, \matrix( must be UnicodeMath, while \matrix{ must be LaTeX. Note that LaTeX by design supports the original TeX control-word sequences like \matrix{…} as well as the LaTeX environments like \begin{matrix}…\end{matrix}. In UnicodeMath autobuildup mode, no build up occurs when the user types \matrix{, so it’s possible at that point to switch to LaTeX input without need for retyping.

Both input modes have \begin and \end, but in LaTeX these are environment control words followed by {, whereas in UnicodeMath they represent generic start/end delimiters for which curly braces would be superfluous. So as soon as the user types { following \begin or \end, a cue recommending a switch to LaTeX input mode can be displayed.

Math Functions

Math functions are also treated differently in LaTeX and in UnicodeMath. To enter the sine function in LaTeX, one types \sin, whereas in UnicodeMath, one just types sin. So, if a math function name is entered preceded by \, a cue recommending a switch to LaTeX input mode can be displayed. The Office math display engine needs to know the argument of a math function as well as the function name in order to insert the correct math spacing. LaTeX doesn’t have a formal way of defining the argument, although enclosing it in curly braces is a good idea. UnicodeMath has precise ways of defining the argument. This is also true for integrands of integrals and n-aryands of n-ary operators in general.

Superscripts and Subscripts

The input a^2+b^2=c^2 represents the same equation in either input mode, but a^10+b^10=c^10 represents a¹⁰ + b¹⁰ = c¹⁰ in UnicodeMath and a¹0 + b¹0 = c¹0 in LaTeX. It doesn’t seem possible to distinguish the user intent for such cases, but it’d be worth asking the user who types a^{ or a_{ whether to switch to LaTeX, since superscripts and subscripts enclosed in curly braces aren’t common in mathematical expressions. Expressions involving e^{…} do occur, but it’s better typography to use \exp{…} instead of raising e to a braced power.
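A minimal LaTeX fragment makes the multi-character-script distinction concrete:

```latex
% Multi-character exponents need braces in LaTeX:
a^{10} + b^{10} = c^{10}   % exponents are the full "10"
a^10   + b^10   = c^10     % LaTeX parses each as a^{1}0, i.e. a¹0
```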

Miscellaneous Control Sequences

Font control words like \mathbf{ are distinctly LaTeX. The TeX binomial-coefficient construct {n\choose k} doesn’t make sense in UnicodeMath (one would type n\choose k without the curly braces). But {n\atop might be used in UnicodeMath since {n\atop k} would build up as n over k (without a fraction bar) enclosed in {}. Admittedly this construct is unlikely since binomial coefficients appear in parentheses, not in curly braces.

Conclusions

We see that there are quite a few [La]TeX constructs that don’t make sense in UnicodeMath and can be used to query the user about switching from UnicodeMath input mode to LaTeX input mode. In addition, such LaTeX-oriented control sequences could be handled directly in UnicodeMath mode. The math build-up engine in Microsoft Office uses the same operator and string stacks for both modes, so it’s fairly straightforward to treat constructs like \frac{…}{…}, \matrix{…}, \begin{matrix}…\end{matrix} directly in UnicodeMath mode. This might make math input more user friendly for people familiar with LaTeX. And it might facilitate migrating to the speedier, more mathematical UnicodeMath input mode. But it does compromise using the build-up engine as a UnicodeMath validator. To that end, if the build-up engine is modified to handle these LaTeX control sequences in UnicodeMath mode, it might be worth having a “strict” mode that would fail input with invalid UnicodeMath. In any event, build-down results are all in one format or the other, not in a mixture of the two.

TFS 2018 RC1 is now available for download!


Work Item linking broken for some users – 08/31 – Workaround


Starting yesterday, you may have noticed that you are unable to create a new linked work item or link to an existing work item from the ‘Links’ tab of a work item. This is a known issue affecting a subset of users. A fix is being rolled out and should be deployed to affected scale units before 9/1/2017 4am UTC.

If you are running into this issue, you can use one of the 2 workarounds listed below to unblock yourself.

Workaround 1: If you are trying to create a new linked work item, you can use the work item form context menu.

Workaround 2: From a query view, you can use the context menu against each work item, or use the linking options available on the ribbon.


Sincerely,
Sri Harsha


Work Item Type and Inclusive Design


A little over a month ago, we rolled out work item type icons to all Visual Studio Team Services (VSTS) accounts and Team Foundation Server (TFS) with 2017.2.

It's been awesome to see @VSTS tweets, Developer Community feedback, and direct emails from customers that are very excited about the icons. Additionally, we've received a lot of questions about the motivation for the change and if it was necessary to replace the color bar for everyone. In this post I'll walk through what motivated our move to work item type icons, the design process, and some of our own learnings as we work to make VSTS accessible to everyone.

Identifying a potential problem

As part of our Visual Design tenet, color cannot be the sole means to convey meaning because users that cannot perceive all colors do not benefit from that information. While assessing the compliance of key WIT scenarios like viewing work items on the Kanban board, in work item grids, and in the links control we noticed that the work item type information is often omitted in favor of a color bar.

Mixture of Bug and User Story cards on a Kanban board. The only work item type information is a bar of color on the left of each card.

This becomes problematic when a user turns on high contrast mode because they immediately lose all work item type coloring, making every User Story, Bug, Feature, etc. appear the same.

The same Kanban board in high contrast mode. The User Story and Bug type is now indistinguishable.

Looking at the side by side of the board with and without high contrast mode enabled, it was clear that this potentially violated the Visual Design tenet. However, we needed to dig deeper to understand if users associated meaning to the color bar.

Defining the problem

Before we started exploring new designs, we first had to identify if this was a violation of the color meaning requirement - Does the color bar provide value to our users? When asked, both internal and external users stated that they use the color bar to quickly understand work item type and many remove the work item type column because it's unnecessary.

Since it is valuable, we then asked: What does a user with no visual impairment get from the work item type color bar? As we answered that question, we identified four important criteria our new design needed to meet:

  1. Recognizable - at a quick glance users should understand what types of work items are visible
  2. Minimalist - the new design should be compact and have a small visual impact
  3. Built in - the solution should be supported in VSTS as a first class citizen; no configuration necessary
  4. Flexible - users should be able to customize their work item type icons to fit the needs of their organization

Designing a solution

To meet our "Built in" criteria, we immediately discarded any design that required user or team configuration. We didn't want to allow for the possibility that an admin could disable or remove functionality that was important to any of our users.

As we thought through the problem, we went through many iterations before landing on the design we have today. Below are a few of the iterations and reasons why we rejected them:

Work item type string

All cards on the board now include "Product Backlog Item" and "Bug" to help distinguish work item type. The cards have expanded greatly in size.

This was rejected because it did not meet our recognizable and minimalist criteria, increasing the amount a user must digest to get the type information. We also played around with 2-3 character shorthand (PBI for Product Backlog Item) but determined strings were ultimately not as recognizable as an icon and presented additional complexity for TFS localization.

Abstract shape and color

Next to the color bar on the Kanban cards, a small square made up of different shaped triangles distinguishes the Bug and Product Backlog Item types.

This design meets all our criteria and in comparison to the type strings was minimalistic. It was eventually rejected in favor of icons due to the difficulty users had in associating the shape with a type.

Still iterating

No matter how much thought goes into a design, there are always things that are unaccounted for; work item type icons were no exception. Almost immediately after rolling out the feature, we received UserVoice suggestions, Developer Community posts, and emails from customers who found the bug icon particularly unsettling. It was likened to "a disgusting cockroach", described by some as "vile and repugnant", and in extreme cases caused users a high level of anxiety due to their own fear of insects.

Card with the old beetle bug

During design reviews of the new icons, we never imagined the impact a 16x16 pixel icon could have on a user's work item tracking experience. In response to feedback, we added two other issue/defect type icons to our catalog as a customization option.

Two Bug cards; One has a broken lightbulb icon while the other has a clipboard with an error exclamation point.

However, we ultimately decided that the best way to alleviate the strong reactions described by our users was to redesign the bug to be more friendly and approachable. After validating with customers both internally and externally, we strongly believe the new ladybug meets those design criteria.

Ladybug is used for the bug work item type

Helpful links

Below I've included some useful links to more information about customizing work item type icons as well as Microsoft's inclusive design philosophy.

Feel free to leave your feedback or questions in the comments section below.

Thanks for reading!

 

Deprecating our Folder Management extension


The Folder Management extension was one of the first extensions we published on the marketplace, spearheaded by Wouter de Kort.

We are building a ‘create folder’ feature in the product for VSTS and TFS 2018 or higher. Because of this, we’re deprecating this extension for those products. Once support for earlier versions of TFS (2015 and 2017) lapses, we’ll unpublish the extension entirely.

What’s the plan?

Deprecation steps:

  1. Initially we’ll make no changes as we wait for the feature to deploy to all environments. Once deployed, VSTS users will see both the custom extension point and the new product feature to create a new folder in their favourite repository.
  2. Add a deprecation notice on the overview page and remove support for VSTS, limiting support to TFS 2015 and TFS 2017 for new users. For VSTS users the custom Folder Management extension point will disappear from the web application.
  3. Unpublish the extension from the marketplace. The extension will no longer appear on the marketplace home page or in marketplace search results. If you remember the extension URL in the marketplace, you’ll see this notice in place of install and download options:
    image

So, how should you create folders in TFVC repositories in future?

Here’s a quick walk-through of the new feature:

  • Select the location in your TFVC repo where you’d like to create a new folder.
  • Click (More Actions), select New (1), and select Folder (2).
    SNAGHTML15c949f3
    Avoid using the custom Folder Management extension point (3) in future.
  • Enter a new folder name (4), optionally enter a comment (5), optionally select one or more linked work items (6), and click Check in (7).
    SNAGHTML15748ce5
  • Done!
    SNAGHTML15760c80

So, how should you create folders in Git repositories in future?

  • Select the location in your Git repo where you’d like to create a new folder.
  • Click (More Actions), select New (1), and select Folder (2).
    SNAGHTML1fe02197[4]
    Avoid using the custom Folder Management extension point (3) in future.
  • Enter a new folder name (4), optionally creating multiple subfolders, specify a new file name (5), and
    click Create (6).
    SNAGHTML1fe3defd
  • Optionally update the content of the new file and click Commit.
    SNAGHTML1fe84cc7
  • Optionally enter a comment (7), optionally select one or more linked work items (8), and click Commit (8).
    SNAGHTML1fe9d981
  • Done!
    SNAGHTML1febc2d3

What happens to our OSS repo?

Nothing! We’ll maintain the sample code in the Folder-Management-Extension repo indefinitely.

It’s a little sad to see our top performing extension make an exit, but it was a great gap filler, and it will continue to serve as a great sample for building extensions.
SNAGHTML15787d61

Insider Fast release notes: 15.39 (170829)


This week's Insider Fast update, version 15.39 (170829), was released on 8/30/2017.

For release notes, please refer to our support.office.com article here.

 

WinDbg Preview FAQ


We've had a ton of comments on our last blog post and social media, so here's some of the top questions we've been seeing. We'll be expanding this as more questions are asked and we have better answers.

The store is saying the app is unavailable?

Please comment in the comments below with your location and full OS version. There were some availability issues yesterday, but they should all be resolved.

What's missing?

There are a few larger things that are missing:

  • Command Browser
  • Scratchpad
  • Modules window
  • Event filters window

Besides those, there are a few command-line options, menu options, and settings that are missing. If you want us to prioritize something that you use often, go file or upvote feedback in the Feedback Hub.

Why aren't my workspaces working?

We're in the process of changing how workspaces work in WinDbg Preview. In WinDbg, workspaces save lots of different information that can be difficult to keep track of. We want to make workspaces easier to control and use.

We'd love your feedback on workspaces, so feel free to give us your thoughts in the Feedback Hub.

WinDbg Preview does ____ slightly different than WinDbg does, is that on purpose?

Sometimes, yes. Sometimes, no. If something works differently than you expect, send it to us in the Feedback Hub, and we can take a look at it. In some places we've changed behavior on purpose as we try to improve the debugging experience, and we want to hear feedback on which changes are good and bad. Putting your feedback in the Feedback Hub will help us know if we should change it back.

Where can I find USBView and all the other executables that ship with WinDbg?

The store version only has WinDbg right now. If you want any of the other tools that ship with the debugger package, you can find them in the SDK or WDK.

Is WinDbg going away?

Not yet, but we hope that someday soon WinDbg Preview will be better than WinDbg in almost every way! You can still get WinDbg in the SDK or WDK for the time being. As we finish adding more features to WinDbg Preview to make it more like WinDbg, we'll be looking at replacing WinDbg. So if you have any major problems in WinDbg Preview, make sure your voice is heard by putting them in the Feedback Hub!

Why is it only in the Windows Store?

The store lets us release updates quickly and automatically and helps ensure that everyone giving us feedback is on a recent version. One of the problems with WinDbg today is that people will tell us something is broken when they’re running a version from years ago. With how quickly we’re hoping to fix bugs and add features early on in the preview, we need people to be able to seamlessly get the latest and greatest version. We’ll be looking at adding it to the SDK and WDK as soon as we’re more comfortable with the stability and features.

Free EBooks available for download on Windows Server 2016


Take advantage of IT innovation while reducing security risks and disruptions. With Windows Server 2016, get new layers of security, datacenter efficiency, and agility in application development backed by Microsoft Azure, one of the world’s largest cloud datacenters.

Download the free e-book to learn about the latest technology in Windows Server 2016 and what it means for your organization. Inside you’ll learn how to:

  • Better protect credentials, the operating system, and virtual machines (VMs) with just-in-time administration and shielded VMs
  • Improve datacenter efficiency with virtualization, software-defined storage, and networking
  • Deliver application innovation with improved security, new modernization capabilities, and cloud-native app development

The Ultimate Guide to Windows Server includes an 18-page overview and 180-page deep-dive.


Issue when XLL add-ins are loaded in Excel 2016 version 1704 or later


Hello, this is Endo from the Office Development Support team.

This article describes a known issue that occurs when an XLL add-in is loaded after updating Office 2016 Click-to-Run (C2R) to version 1704 (16.0.8067.xxxx) or later.

When Excel starts normally, the Start screen (the screen for selecting templates and so on) is displayed.
The Start screen itself can be disabled under [File] - [Options], in [General], by turning off the "Show the Start screen when this application starts" option.

However, when an XLL add-in is loaded, Excel 2016 starts without showing the Start screen, regardless of this setting, and without creating a new workbook.

We consider this behavior a product issue and are currently discussing internally how to address it.
For the time being, please open the Start screen from [File] - [New].

We will update this article as soon as there is any news on this issue.

That's all for this post.

The content of this information (including attachments and linked pages) is current as of the date of writing and is subject to change without notice.

Issue with the CreateDesktop function on Windows 10


Hello, this is the Platform SDK (Windows SDK) support team.
This post describes a known issue on Windows 10 that occurs when a new desktop is created with the CreateDesktop function.

Symptom
Consider an application that creates a new desktop on Windows 10 using the CreateDesktop function.
Normally, on Windows 10, pressing Ctrl + Alt + Del and locking the screen displays the lock screen.

However, when the screen is locked with Ctrl + Alt + Del from a desktop created by the CreateDesktop function, the lock screen is not displayed and the desktop remains visible.
Once this state occurs, no operations other than Ctrl + Alt + Del are accepted.

Cause
Currently under investigation. Details will be posted here as soon as they are known.

Workaround
After the symptom occurs, press Ctrl + Alt + Del and dismiss the lock.

Status
This article will be updated as soon as there is progress.

Automating Azure Analysis Service Processing using Azure Automation Account


 

Analysis Services has been progressing day by day with exciting new features, and users have been asking for a way to automate Azure Analysis Services processing. There are a few ways in which we can automate the processing.

  • Using the conventional on-premises approaches: a SQL Server Agent job, AMO objects, or PowerShell.

Since we are dealing with Azure, we need automation that doesn't depend on an on-premises VM to execute a PowerShell script or an on-premises SQL Server instance to run a SQL Server Agent job.

Also, there are scenarios where we need to deal with 2-factor authentication, where we either get prompted for phone authentication or need to re-enter the credential while connecting to Azure Analysis Services. Now think about a job scheduled to run unattended with an on-premises scheduler: it might prompt for authentication if the AD token expires. There are ways to tackle that, but we will not discuss them here.

We can use Azure Automation to schedule a PowerShell script that works with Azure Analysis Services. Here are the steps we need to follow –

 

Objective: We will create a partition for a fact table and process it, in the database TabDemo on my Azure Analysis Services instance: asazure://southeastasia.asazure.windows.net/azureasdemo

 

Steps:

1. Creating Azure Automation Account and adding the SQL PowerShell Module

a. Login to http://portal.azure.com

b. Search for "Automation Account".

c. Create an automation account.

d. Now you would be able to see the automation account which you just created. The name is “samasutomationaccount”

e. You need to Import the SQLServer PowerShell Modules first.

f. Click on "Browse Gallery" and search for "SQLServer".

g. Click on Import and then the OK button at the bottom right corner of your screen.

h. You would be able to see the SQLServer module has been imported in your automation account gallery.

SQLServer Module:

You can download the SQL Server module from this link if you want to use it on-premises: https://www.powershellgallery.com/packages/SqlServer/21.0.17178

Here are the commands you can use for the Analysis Services:

https://docs.microsoft.com/en-us/sql/analysis-services/powershell/analysis-services-powershell-reference

Cmdlet | Description | Operation
Add-RoleMember cmdlet | Add a member to a database role. | Add
Backup-ASDatabase cmdlet | Backup an Analysis Services database. | Database.Backup
Invoke-ASCmd cmdlet | Execute a query or script in XMLA or TSML (JSON) format. | Execute
Invoke-ProcessASDatabase | Process a database. | Process
Invoke-ProcessCube cmdlet | Process a cube. | Process
Invoke-ProcessDimension cmdlet | Process a dimension. | Process
Invoke-ProcessPartition cmdlet | Process a partition. | Process
Invoke-ProcessTable cmdlet | Process a table in a Tabular model, compatibility model 1200 or higher. | Process
Merge-Partition cmdlet | Merge a partition. | Merge
New-RestoreFolder cmdlet | Create a folder to contain a database backup. | RestoreFolder
New-RestoreLocation cmdlet | Specify one or more remote servers on which to restore the database. | RestoreLocation
Remove-RoleMember cmdlet | Remove a member from a database role. | Remove
Restore-ASDatabase cmdlet | Restore a database on a server instance. | Restore


2. Creating Credential:

a. We need to define a credential here, which we will use in the PowerShell code later.

b. Specify a credential that has Admin access on the Azure AS instance and then click Create. The name of the credential I created is "SamCred".

 

3. Creating Runbook:

a. Select the Runbook

b. Click on the Add run book

c. Enter the below details:

Choose PowerShell as the Runbook type and then click on CREATE

4. Create the PowerShell cmdlet script to automate partition creation and processing:

a. The main objective of the code is to automate the creation of the partition for the current month and delete the 36-month-old partition.

Go to the Runbook we created earlier. Click on Edit

b. Enter the below PowerShell script:

##Getting the credential which we stored earlier.
$cred = Get-AutomationPSCredential -Name 'SamCred'

## Providing the Server Details
$ServerName = "asazure://southeastasia.asazure.windows.net/azureasdemo"
$DatabaseName = "TABDEMO"
$TableName ="FactInternetSales"

## Getting the current Month and Year
$a= Get-Date
$startMonth=$a.Month  ## use the current month (was hardcoded to 10 for testing)
$startYear=$a.Year
if ( $startMonth -eq 12)
{
$endMonth=1
$endYear=$startYear+1
}
if ( $startMonth -ne 12)
{
$endMonth =$startMonth+1
$endYear=$startYear
}
## Pad a leading 0 if the month is a single digit
if ( $startMonth -ne 10 -and $startMonth -ne 11 -and $startMonth -ne 12)
{
$startMonth=$startMonth.ToString("00")
}

if ( $endMonth -ne 10 -and $endMonth -ne 11 -and $endMonth -ne 12)
{
$endMonth=$endMonth.ToString("00")
}
$startMonth
$endMonth

##creating the partition for the current month and current year ( You can script out the JSON code from SSMS)

$Query = "{
`"createOrReplace`": {
`"object`": {
`"database`": `"TABDemo`",
`"table`": `"FactInternetSales`",
`"partition`": `"FactInternetSales_"+ $startMonth+$startYear+"`"
},
`"partition`": {
`"name`": `"FactInternetSales_"+$startMonth+$startYear+"`",
`"source`": {
`"query`": [
`"SELECT * FROM [dbo].[FactInternetSales] WHERE ORDERDATEKEY >= N'"+ $startYear+$startMonth+"01"+ "' AND ORDERDATEKEY < N'"+ $endYear+$endMonth+"01" +"'`"
],
`"dataSource`": `"SqlServer localhost AdventureWorksDW2014`"
}
}
}
}
"
#$Query
##Creating the partition

Invoke-ASCmd -Server $ServerName -Credential $cred -Query $Query
##Processing the partition

$PartName= "FactInternetSales_"+$startMonth+$startYear
$PartName
$result=Invoke-ProcessPartition -Server $ServerName -Database $DatabaseName -TableName $TableName -PartitionName $PartName -RefreshType Full -Credential $cred

##Deleting the Old partition

if ( [int]$startMonth -eq 1)
{
$prevMonth=12
$prevYear=$startYear-2
}
if ( [int]$startMonth -ne 1)
{
$prevMonth=$startMonth-1
$prevYear=$startYear-3
}
if ( $prevMonth -ne 10 -and $prevMonth -ne 11 -and $prevMonth -ne 12)
{
$prevMonth=$prevMonth.ToString("00")
}
$prevMonth

$delQuery="
{
`"delete`": {
`"object`": {
`"database`": `"TABDemo`",
`"table`": `"FactInternetSales`",
`"partition`": `"FactInternetSales_"+$prevMonth+$prevYear+"`"
}
}
}
"

#$delQuery

Invoke-ASCmd -Server $ServerName -Credential $cred -Query $delQuery

$error[0].Exception.Message
$error[0].Exception.StackTrace
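As a cross-check of the month arithmetic in the script above, here is an illustrative Python sketch of the same window and partition-name logic (the function and variable names are mine, not part of the runbook):

```python
from datetime import date

def partition_windows(today):
    """Compute the current-month window keys and the name of the
    ~3-year-old partition to drop, mirroring the script's arithmetic."""
    start_month, start_year = today.month, today.year
    # End of the window is the first day of the next month (exclusive).
    if start_month == 12:
        end_month, end_year = 1, start_year + 1
    else:
        end_month, end_year = start_month + 1, start_year
    # Old partition slot, matching the script's prevMonth/prevYear branches.
    if start_month == 1:
        prev_month, prev_year = 12, start_year - 2
    else:
        prev_month, prev_year = start_month - 1, start_year - 3
    return (
        f"{start_year}{start_month:02d}01",                # window start key
        f"{end_year}{end_month:02d}01",                    # window end key
        f"FactInternetSales_{prev_month:02d}{prev_year}",  # partition to drop
    )

print(partition_windows(date(2017, 10, 15)))
# → ('20171001', '20171101', 'FactInternetSales_092014')
```

The two date keys correspond to the ORDERDATEKEY bounds in the createOrReplace query, and the last value matches the partition name passed to the delete command.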

 

c. Click on the Test pane, and then hit Start to test.

d. Schedule the runbook.

e. Click on the Add Schedule and enter the details:

 

This is how you can schedule the processing. To know more about Azure Automation, please refer to the links below:

https://docs.microsoft.com/en-us/azure/automation/automation-intro

https://azure.microsoft.com/en-in/pricing/details/automation/

 

 

Author:     Samarendra Panda - Support Engineer, SQL Server BI Developer team, Microsoft

Reviewer:  Kane Conway – Support Escalation Engineer, SQL Server BI Developer team, Microsoft

 

