
CIMOL Goes to Seattle: The Journey to Seattle


As planned, team CIMOL traveled today from Jakarta to Seattle. Because one team member (Tifani) had to attend her graduation ceremony, our group was split in two: the first group, consisting of Adi, Fery, and Bu Ayu (our mentor), departed on July 22, while Tifani departed on July 23.

Today, the group of Adi, Fery, and Bu Ayu departed on an ANA flight at 6:15 AM from Jakarta to Tokyo Narita. Check-in and immigration all went smoothly.


After a 7.5-hour flight to Tokyo Narita, team CIMOL landed at 3:50 PM local time and had a two-hour layover there.


After that, team CIMOL flew out again with ANA at 6:05 PM local time.


After a nine-hour flight, team CIMOL landed safely in Seattle at 11:25 AM local time on July 22, 2017.

After clearing immigration and customs, team CIMOL headed straight to Alder Hall on the University of Washington campus and checked in there. The rest of the day was spent having lunch together, touring the city of Seattle, and having dinner together. We deliberately did not rest right away, so as to avoid jet lag and adjust to local time immediately.

Before we turned in, we received word that Tifani had boarded her flight from Jakarta to Tokyo Narita.


We hope Tifani's journey goes as smoothly as ours did.

Tomorrow is registration day for all Imagine Cup 2017 World Finals participants. In the late afternoon there will also be a participant briefing, dinner, and a group photo. Please wish us luck!


Reset lost admin account password


Symptom:

If you have lost your admin account password, or you need to change it for any other reason, follow this article to reset your admin account password.

Resolution:

Option 1 – Using Azure Portal

  1. In the Azure Portal, open your Azure SQL Server blade.
  2. Make sure you are on the Overview blade.
  3. Click "Reset password" at the top of the Overview blade.
  4. Set the new password and click Save.

Figure 1 – reset password using Azure Portal.

Option 2 – Using Azure CLI

  1. Open Azure CLI – choose the option that works for you
    1. On your workstation (installation instructions here)
    2. In the Azure Portal, click the CLI button

      Figure 2 – Azure CLI using the Portal

  2. Run the following command, changing the names to match your environment.

    az sql server update --resource-group <ResourceGroupName> --name <Servername> --admin-password <NewAdminAccountPassword>

Figure 3 – Output of the CLI on Azure Portal – Blurred

Option 3 – PowerShell

  1. Make sure you have the AzureRM PowerShell module installed (installation instructions here)
  2. Run the following PowerShell cmdlets:

    Login-AzureRmAccount

    Set-AzureRmSqlServer -ResourceGroupName <ResourceGroupName> -ServerName <ServerName> -SqlAdministratorPassword (ConvertTo-SecureString "<NewAdminAccountPassword>" -AsPlainText -Force)

Figure 4 – PowerShell output – Blurred

Option 4 – Using T-SQL

This is the least common option: if you can already connect to SQL, you have the password of another admin account.

  1. Use any client (SSMS, sqlcmd, the PowerShell Invoke-Sqlcmd cmdlet, or any other client application)
  2. Connect to the master database
  3. Run the following T-SQL command:

ALTER LOGIN <AdminAccountName> WITH PASSWORD = '<NewAdminAccountPassword>';
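
For example, with sqlcmd (a sketch; the server name and the second admin account are placeholders for your own values):

    sqlcmd -S <ServerName>.database.windows.net -d master -U <OtherAdminAccount> -P <OtherAdminPassword> -Q "ALTER LOGIN <AdminAccountName> WITH PASSWORD = '<NewAdminAccountPassword>';"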

IP Address Mapping in Power BI


Sam Lester Power BI Blog

I recently assisted in troubleshooting an issue where the error logs contained several unknown IP addresses. During this process, I created a quick dashboard in Power BI to display the locations of these IP addresses on a map, to get a better understanding of where the machines were located. I used a free service from IPInfoDB, which requires registration to obtain an API key but is straightforward and worked very well for this project.

The basis of this solution is calling the free web service, which returns JSON, then parsing it with M code to obtain the individual fields (country, latitude, longitude, etc.). Doing this manually through Get Data -> Web and passing the full URL, we can see an example of the data returned by the lookup service.

IP Address Lookup

Assuming that your Power BI report contains a column called “IP Address”, the following steps will allow you to create the map of IP address locations.

1. Create a new column that contains the full URL used to look up each IP address.

= Table.AddColumn(#"Changed Type","FullIPURLCity", each "http://api.ipinfodb.com/v3/ip-city/?key=[URL_Key]&ip="&[IP Address]&"&format=json")

2. Replace the string [URL_Key] with the key obtained during registration (link above).


3. Create the lookup function in M (create a blank query, open Advanced Editor, paste the following code, and rename the function as GetAllFromIP).

let
Source = (FullURL) =>
let
Source = Json.Document(Web.Contents(FullURL)),
#"Converted to Table" = Record.ToTable(Source),
#"Transposed Table" = Table.Transpose(#"Converted to Table"),
#"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true])
in
#"Promoted Headers"
in
Source

4. Click "Close & Apply" to run the lookup function for each of the IP addresses in your report.
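
For reference, the per-row invocation that ties steps 1 and 3 together looks like the following (a sketch: the step names and the expanded field names are assumptions based on the snippets above and the IPInfoDB JSON shape):

= Table.AddColumn(#"Added FullIPURLCity", "IPDetails", each GetAllFromIP([FullIPURLCity]))

= Table.ExpandTableColumn(#"Added IPDetails", "IPDetails", {"countryName", "regionName", "cityName", "latitude", "longitude"})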


The sample .pbix file can be downloaded here.

Thanks,
Sam Lester (MSFT)

Notes on using the XmlSerializer class from an STA thread in a .NET Framework console application


Hello, this is the Visual Studio support team.

This post covers a caveat that applies when a thread marked with the STA attribute in a .NET Framework console application uses COM components, directly or indirectly, for example via the XmlSerializer class.

 

The caveat

When a COM component that belongs to an STA is created, COM guidelines require the owning STA thread to run a message pump periodically and process window messages. If the message pump is not running, the COM component cannot be reached from outside that STA. In a .NET Framework console application in particular, the finalizer thread can then hang because it cannot communicate with the STA thread, which can lead to problems such as memory leaks.

The requirement that STA threads run a message pump is explained in the following document:

[OLE] OLE スレッド モデルの概要としくみ
https://support.microsoft.com/ja-jp/help/150777/info-descriptions-and-workings-of-ole-threading-models

 

A concrete example

When specific constructor overloads (*1) of the .NET Framework XmlSerializer class are used, the class generates and caches an assembly internally. This processing uses COM internally: an object created from an STA thread belongs to the STA, and one created from an MTA thread belongs to the MTA.

 

(*1) The constructor overloads of the XmlSerializer class that generate assemblies are documented here:

XmlSerializer Class
https://msdn.microsoft.com/en-us/library/system.xml.serialization.xmlserializer(v=vs.110).aspx
----
Dynamically Generated Assemblies
To increase performance, the XML serialization infrastructure dynamically generates assemblies to serialize and deserialize specified types. The infrastructure finds and reuses those assemblies. This behavior occurs only when using the following constructors:

XmlSerializer.XmlSerializer(Type)
XmlSerializer.XmlSerializer(Type,String)

If you use any of the other constructors, multiple versions of the same assembly are generated and never unloaded, which results in a memory leak and poor performance. The easiest solution is to use one of the previously mentioned two constructors.
----

 

Now imagine a console application that runs for a fairly long time, whose Main method is marked with the [STAThread] attribute, and which uses XmlSerializer.

When an XmlSerializer is created with one of those constructors, a COM object belonging to the STA is created, as described above. The application keeps running; eventually garbage collection (GC) runs, and the finalizer thread, which is part of the GC machinery, finalizes any finalizable objects.

Those finalizable objects include the COM object that XmlSerializer uses for its assembly cache (strictly speaking, its managed wrapper, an RCW object). Because this object was created in an STA, the COM infrastructure ensures it runs only on that STA thread, so it cannot be accessed directly from threads outside the STA, such as the finalizer thread. Access from outside the STA is mediated by the COM infrastructure, which dispatches the work to the STA thread via window messages.

The STA thread must be able to receive the work requests that the finalizer thread posts as window messages. If the application, such as a console application, does not run a message pump, the requests are never serviced, and the finalizer thread hangs waiting for a response.

As a result, finalization of finalizable objects stops making progress, which can lead to problems such as memory leaks.

 

Repro code

The behavior described above can be reproduced with sample code like the following:

[STAThread]
static void Main(string[] args)
{
    XmlSerializer serializer = new XmlSerializer(typeof(MyClass), "http://www.microsoft.com");
    serializer = null;

    GC.Collect();
    for (;;)
    {
        System.Threading.Thread.Sleep(100);
    }
}

Build and run the sample code, then use a debugger such as WinDbg to examine the state of the finalizer thread.
From the list of managed threads, identify the finalizer thread. Thread 5 is the finalizer thread.

0:009> !sos.threads
ID OSID ThreadOBJ State GC Mode GC Alloc Context Domain Count Apt Exception
0 1 83c 0062f7b0 26020 Preemptive 0259CB5C:00000000 0062aa58 1 STA
5 2 2ef4 0063eb28 2b220 Preemptive 00000000:00000000 0062aa58 0 MTA (Finalizer) 

 

Check the state of the finalizer thread from its call stack.
In the listing below, the columns from left to right are frame number, base pointer, return address, and module!function.
Functions are called from bottom to top.
In the course of the RCW cleanup that starts at frame 15, a cross-apartment call is made (visible at frame 03), and the thread is waiting for the response.
(The following is an example from .NET Framework 4.7; the functions used may differ between .NET Framework versions.)

0:009> ~5k
# ChildEBP RetAddr
00 0463ee10 77362bf3 ntdll!NtWaitForMultipleObjects+0xc
01 0463efa4 770195bb KERNELBASE!WaitForMultipleObjectsEx+0x103
02 0463effc 76feec6d combase!MTAThreadWaitForCall+0xdb
03 (Inline) -------- combase!MTAThreadDispatchCrossApartmentCall+0xaf5
04 (Inline) -------- combase!CSyncClientCall::SwitchAptAndDispatchCall+0xbd4
05 0463f1a8 76fef80b combase!CSyncClientCall::SendReceive2+0xcbd
06 (Inline) -------- combase!SyncClientCallRetryContext::SendReceiveWithRetry+0x29
07 (Inline) -------- combase!CSyncClientCall::SendReceiveInRetryContext+0x29
08 0463f204 76feda65 combase!DefaultSendReceive+0x8b
09 0463f2fc 76f344b5 combase!CSyncClientCall::SendReceive+0x3a5
0a (Inline) -------- combase!CClientChannel::SendReceive+0x7c
0b 0463f328 777067e2 combase!NdrExtpProxySendReceive+0xd5
0c (Inline) -------- RPCRT4!NdrpProxySendReceive+0x21
0d 0463f570 76f35e20 RPCRT4!NdrClientCall2+0x4a2
0e 0463f590 7703120f combase!ObjectStublessClient+0x70
0f 0463f5a0 76f989e1 combase!ObjectStubless+0xf
10 0463f630 76f98a99 combase!CObjectContext::InternalContextCallback+0x1e1
11 0463f684 73bfeff6 combase!CObjectContext::ContextCallback+0x69
12 0463f784 73bff0ca clr!CtxEntry::EnterContext+0x252
13 0463f7bc 73bff10b clr!RCW::EnterContext+0x3a
14 0463f7e0 73bfeed3 clr!RCWCleanupList::ReleaseRCWListInCorrectCtx+0xbc
15 0463f83c 73bfd7f8 clr!RCWCleanupList::CleanupAllWrappers+0x14d
16 0463f88c 73bfdac8 clr!SyncBlockCache::CleanupSyncBlocks+0xd0
17 0463f89c 73bfd7e7 clr!Thread::DoExtraWorkForFinalizer+0x75
18 0463f8cc 73bd1e09 clr!FinalizerThread::FinalizerThreadWorker+0xba
19 0463f8e0 73bd1e73 clr!ManagedThreadBase_DispatchInner+0x71
1a 0463f984 73bd1f40 clr!ManagedThreadBase_DispatchMiddle+0x7e
1b 0463f9e0 73cba825 clr!ManagedThreadBase_DispatchOuter+0x5b
1c (Inline) -------- clr!ManagedThreadBase_NoADTransition+0x2a
1d 0463fa08 73cba8ef clr!ManagedThreadBase::FinalizerBase+0x33
1e 0463fa44 73be5dc1 clr!FinalizerThread::FinalizerThreadStart+0xd4
1f 0463fadc 75a68744 clr!Thread::intermediateThreadProc+0x55
20 0463faf0 779e582d KERNEL32!BaseThreadInitThunk+0x24
21 0463fb38 779e57fd ntdll!__RtlUserThreadStart+0x2f
22 0463fb48 00000000 ntdll!_RtlUserThreadStart+0x1b

 

How to fix it

Give the STA thread a message pump so it can respond to requests from outside.
Running the following statement periodically, for example inside the thread's main loop, pumps messages:


System.Threading.Thread.CurrentThread.Join(0);
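
Applied to the repro code above, the loop would look like this (a minimal sketch):

[STAThread]
static void Main(string[] args)
{
    XmlSerializer serializer = new XmlSerializer(typeof(MyClass), "http://www.microsoft.com");
    serializer = null;

    GC.Collect();
    for (;;)
    {
        // Pump pending window messages so that cross-apartment requests
        // (such as the finalizer thread's RCW cleanup) can be serviced.
        System.Threading.Thread.CurrentThread.Join(0);
        System.Threading.Thread.Sleep(100);
    }
}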


Alternatively, if the code does not actually require an STA, as in the sample above, you can avoid the problem by omitting [STAThread] and letting the thread run as a default MTA thread.
Note that applications with a UI, such as Windows Forms applications, have a message pump built in, so they normally do not hit the problem described in this post.

Microsoft 365 announced at Inspire 2017 | Top 4 things for partners to know


At Inspire 2017, Satya introduced Microsoft 365, which brings together Office 365, Windows 10, and Enterprise Mobility + Security to deliver a complete, intelligent, and secure solution to empower employees. Below are the top 4 things partners in Australia need to know about the announcement.

1. Microsoft 365 provides two commercial offerings that support needs from the largest enterprise to the smallest business:

  • Microsoft 365 Enterprise is designed for large organisations and integrates Office 365 Enterprise, Windows 10 Enterprise, and Enterprise Mobility + Security to empower employees to be creative and work together, securely. Microsoft 365 Enterprise replaces Secure Productive Enterprise to double down on the new customer promise of empowering employees to be creative and work together, securely.
  • Microsoft 365 Business is designed for small-to-medium sized businesses with up to 300 users and integrates Office 365 Business Premium with tailored security and management features from Windows 10 and Enterprise Mobility + Security. It offers services to empower employees, safeguard the business, and simplify IT management. Microsoft 365 Business will be available in public preview on August 2.

2. Microsoft 365 Enterprise is offered in two plans, Microsoft 365 E3 and Microsoft 365 E5. Both are available for purchase on August 1, 2017. Microsoft 365 Business will be available in public preview on August 2, 2017, and will become generally available on a worldwide basis in the fall of 2017.

3. As a part of our commitment to small-to-medium sized customers, we also announced three tailored applications that are coming to Office 365 Business Premium and Microsoft 365 Business. These new applications are rolling out in preview over the next few weeks to Office 365 Business Premium subscribers in the U.S., U.K. and Canada, starting with those in the first release program. General availability outside of these markets has not yet been announced.

  • Microsoft Connections – A simple-to-use email marketing service.
  • Microsoft Listings – An easy way to publish your business information on top sites.
  • Microsoft Invoicing – A new way to create professional invoices and get paid fast.

4. Microsoft 365 represents a significant opportunity for partners to grow their businesses through differentiation of offerings, simplification of sales processes, and incremental revenue. For more information, please refer to the resources listed below.

So What Is WebAssembly All About?


Every now and then, people get excited about a new feature being developed as a web standard. One such technology that has been garnering excitement lately is WebAssembly, a new way of running code on the web. Think of it as a new kind of application runtime available in the browser. It evolved from asm.js, a subset of normal JavaScript that can be optimized to run extremely quickly (smaller download, much faster parsing in browsers) and that the browser can verify as safe extremely quickly. WebAssembly is the new binary format for delivering that kind of code to the browser. It's a low-level binary code format that is not meant to be read or written by humans; the idea is that you can compile to this format from other languages.

The best way to understand such a technology is to see it working. For this demo I will be using emcc to compile C code to WebAssembly. In this example I am creating the Fibonacci series for a given positive number.
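
The post doesn't include the source listing, but a minimal fibonacci.c along these lines matches the description (this particular iterative implementation is an assumption):

#include <stdio.h>
#include <stdlib.h>

/* Print the Fibonacci series up to the given number of terms. */
void fibonacci(int terms) {
    long long a = 0, b = 1;
    for (int i = 0; i < terms; i++) {
        printf("%lld ", a);
        long long next = a + b;
        a = b;
        b = next;
    }
    printf("\n");
}

int main(int argc, char *argv[]) {
    int terms = (argc > 1) ? atoi(argv[1]) : 10;
    fibonacci(terms);
    return 0;
}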

I used Emscripten, an open-source LLVM-based compiler from C and C++ to JavaScript (C => LLVM => Emscripten => JS). The following command produces the .js, .js.map, and .wasm files:

emcc fibonacci.c -s WASM=1 -o fibonacci.js

Next, I loaded the .js file in the browser, which produced the following results:

As you saw in the example above, I was able to write my code in C and compile it to run in the browser. Here is the browser support for WebAssembly as of June 2017:

At the time of publishing this post, work is being done on compiling other languages, such as Rust and Swift, to WebAssembly. Steve Sanderson also has an experimental project called Blazor that shows .NET being compiled to WebAssembly.

Using Eclipse and Java to build and host a web app on Azure


Guest blog by David Farkas, Microsoft Student Partner at the University of Cambridge


About Me

I started immersing myself in the creative side of technology just a few years ago. I'll be returning to Cambridge in the fall to further my education in Computer Science.

My LinkedIn: https://www.linkedin.com/in/david-farkas-3b00a8a1/

My GitHub: https://github.com/Veraghin

Introduction

Nowadays, the cloud is increasingly part of every programmer's life, as most applications deal with or run on the internet. Microsoft Azure is one of these cloud platforms. I spent my first year in Cambridge mostly programming in Java, so it was a given that I would try to see how Java and Azure could be used together.

The starting point is https://azure.microsoft.com/en-us/develop/java/, which has a collection of links to detailed tutorials about using Java on Azure. After a bit of research, I decided to build on the guide about creating a basic Azure web app using Eclipse.

This tutorial, which can be found at https://docs.microsoft.com/en-us/azure/app-service-web/app-service-web-eclipse-create-hello-world-web-app, gave the backbone to my project.

In this post, I'll go over the basics of converting an existing Java desktop application to an applet, and my experience hosting it on Azure.

Project setup

 

I'm assuming you already have Eclipse set up on your computer. You will need the Azure Toolkit for Eclipse installed, which can be found in the Eclipse Marketplace. With it, you can deploy and manage applications running on Azure from the Eclipse IDE itself.


You'll also need to make sure you have the Eclipse Java EE Developer Tools installed, also from the Eclipse Marketplace; they help you create web applications in Eclipse.

My code can be downloaded from https://github.com/Veraghin/GameOfLifeApplet

To set things up, create a Dynamic Web Project in Eclipse, then copy the source code to the new WebContent folder.

The process of converting from a desktop application to an Applet

The original project was an implementation of Conway's Game of Life in Java. Converting an existing Java desktop application that uses Java Swing for the GUI is relatively straightforward: the top class, which extends JFrame, can be changed to extend JPanel, and after removing a few calls from the constructor that are not needed for an applet, the JPanel can be set as the content pane of a new skeleton applet class.

In this case, my original top class was GUILife, extending the JFrame Swing class; it had to change to extend JPanel, because the applet will be embedded in a website and can't have its own window. The constructor also had to change from:

public GUILife(PatternStore ps) throws IOException {
    super("Game of Life");
    mStore=ps;
    setDefaultCloseOperation(EXIT_ON_CLOSE);
    setSize(1024,768);
    setLayout(new BorderLayout());
    add(createPatternsPanel(),BorderLayout.WEST);
    add(createControlPanel(),BorderLayout.SOUTH);
    add(createGamePanel(),BorderLayout.CENTER);
}

 



to just:

public GUILife(PatternStore ps) throws IOException {
    mStore=ps;
    setLayout(new BorderLayout());
    add(createPatternsPanel(),BorderLayout.WEST);
    add(createControlPanel(),BorderLayout.SOUTH);
    add(createGamePanel(),BorderLayout.CENTER);
}


The super("Game of Life") call, which would set the window title of the JFrame, is redundant; the HTML takes care of it, along with setting the applet size. The setDefaultCloseOperation call is also unnecessary, as closing is handled by the browser running the application.

The following code is the skeleton applet into which the former desktop application is inserted as its content pane:

 

package gameOfLife;

import java.io.IOException;

import javax.swing.JApplet;

public class GameOfLifeApplet extends JApplet {
    public void init(){
        try{
            PatternStore starter = new PatternStore("patterns");
            GUILife gui = new GUILife(starter);
            gui.setOpaque(true);
            setContentPane(gui);
        }catch(IOException e){
            e.printStackTrace();
        }
    }
}

 



This could run as an applet on its own, but I set out to embed it in a website. To achieve that, I used the Dynamic Web Project in Eclipse, following the tutorial linked at the top.

The Dynamic Web Project template provided by Eclipse automatically creates an index.jsp file, which is the homepage of the website; this is where the applet is embedded:

<object type="application/x-java-applet"
        classid="clsid:8AD9C840-044E-11D1-B3E9-00805F499D93"
        width="1024" height="768">
        <param name="code" value="gameOfLife/GameOfLifeApplet.class">
        <param name="archive" value="GameOfLife.jar">
        <param name="permissions" value="sandbox" />
</object>



Project Deployment

From here, the project can be deployed to Azure straight away. Right-clicking on the project name and selecting Azure -> Publish as Azure Web App… takes you to the Azure login screen, and then you can create or select the App Service you want to deploy to. The whole process is straightforward and well documented; it makes deploying straight to Azure an easy process.

When creating a new App Service, the Azure portal provides more information, but you can customize all the important parts, such as location, pricing tier, and Java and web container versions, straight from Eclipse.


The Final Implementation

This is an unsigned applet, so getting it to run requires jumping through a few hoops, but it is all done for the sake of security. Starting with Java 7 Update 51, applets that are not signed by a trusted authority are not allowed to run in the browser. This can be circumvented by adding the applet's URL to the exception list in the Java Control Panel, following this guide to adding a URL to the exception site list: https://java.com/en/download/faq/exception_sitelist.xml

More info on the security impact of applets:

https://java.com/en/download/help/jcp_security.xml

https://docs.oracle.com/javase/tutorial/deployment/applet/security.html

The website itself can be found at the following URL; it works in Internet Explorer:

https://webapp-170717105204-gameoflife.azurewebsites.net/

For testing purposes, you can see the applet running by calling "appletviewer index.jsp" on the command line in a folder containing the source files. The appletviewer command is part of the Java SDK.

Some pictures of the final version:


Introducing: Analytical Workspaces in Dynamics 365


***ANNOUNCING*** the general availability of Analytical Workspaces & Reports in Dynamics 365 for Operations. Built-in analytical applications are now available as standard in the Spring '17 release. The following article offers insights into the Power BI service integration, with direct links to walkthrough guides and best practices published by the Dynamics 365 product group.

What's important to know…?

  • VALUE PROP - For a general overview of the advantages of using the Power BI service to deliver embedded analytics throughout the organization, review the article here.
  • PORTFOLIO - Usage details on the collection of analytical applications delivered as part of the Dynamics 365 for Operations Spring '17 release (aka v7.2) are available here.
  • AUTHORING - Learn how to use Power BI Desktop to author analytical solutions in a local development environment using the instructions here.
  • CUSTOM SOLUTIONS - To extend the application with custom solutions, use the developer walkthrough, which includes X++ code samples and form control properties, available here.

Analytics for the Entire Organization

The following image offers a sneak peek at the built-in visualizations delivered as standard with the Dynamics 365 for Operations service as of July '17.

Note: Given the speed of innovation, this list is subject to change as we continue to deliver advanced analytics directly in the application, empowering every level of your organization.

Frequently Asked Questions (FAQ)

Q:   Can I customize the Power BI embedded reports?

A:    Yes, simply install Power BI Desktop onto a 1Box to get started, using the steps described here.

Q:   Do customers need to purchase a separate Power BI license to use the new embedded analytics?

A:    No. However, a Power BI Pro license is required to connect to the Entity Store using DirectQuery from PowerBI.com.

Q:   Can I perform data mashups using external data in the Embedded Reports?

A:    Not at this time.  Data mashups can be performed

Q:   Can I secure data to only those companies I have access to?

A:    Yes. The single-company view prevents users from accessing data from companies they don't have access to. For more information on securing custom solutions, follow the guidance provided here.

Q:   How is currency displayed across multiple companies?

A:    In the system currency (System administration > Setup > System parameters).

Q:   Can I drill on summary balances back into Dynamics 365?

A:    You can drill into the details within a Power BI report. There is limited support for drilling down into Dynamics 365.

Q:   What languages are currently supported?

A:    English only at present; however, the Power BI team has additional languages planned.

Q:   Can I access Analytical Workspaces & Reports in Local Business Data?

A:    Not at this time. Systems of Intelligence functions are available for cloud-hosted solutions.


Running SQL Server 2017 on Docker


Microsoft Japan Data Platform Tech Sales Team

Shingo Sakamoto

 

Introduction

Starting with SQL Server 2017, the SQL Server database engine runs not only on Windows but also on Linux. On this blog we have already covered the underlying architecture, availability configurations on Linux (Always On Availability Groups), and the Linux support of SQL Server Integration Services, the ETL tool included with SQL Server.

 

SQL Server on Linux also supports Docker containers. A standard workload like SQL Server benefits greatly from the high portability and flexibility that Docker containers offer. This post walks through using SQL Server in a Docker container environment.

First, let's build a Docker container environment on Azure. The Azure Marketplace has a Docker on Ubuntu Server image, so we will use it to spin up a Docker host.

 

System requirements

First, size the VM according to the system requirements, which are listed below for the Docker container environment. When I created a VM for this blog post, I initially launched the default "Standard A1 (1 core, 1.75 GB memory)" size and got stuck with the SQL Server container never reaching the UP status. Checking the system requirements really does matter.

- Docker Engine: 1.8 or later
- Free disk space: at least 4 GB
- Memory: at least 4 GB
- Follow the system requirements for SQL Server on Linux

 

Since the last item says to follow the SQL Server on Linux requirements, here they are as well:

- Memory: 3.25 GB
- File system: XFS or EXT4
- Free disk space: 6 GB
- Processor speed: 2 GHz
- Processor cores: 2
- Processor type: x64-compatible only

Preparation

The Marketplace offers VMs in many sizes; to meet the requirements above, I chose DS2_v2.

 

Once the server is up, log in with an SSH client such as PuTTY and start by downloading the SQL Server container image from Docker Hub. The following Bash command is all it takes:

 

docker pull microsoft/mssql-server-linux

 

 

(In the environment created from the Azure Marketplace this was not necessary, but depending on your setup you may need to elevate privileges, for example with sudo.)

 

Getting and running the Docker container image

Next, let's run the SQL Server container image we just obtained. Execute the following command:

docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrong!Passw0rd>' -p 1433:1433 -d microsoft/mssql-server-linux

 

Here is what each parameter means:

 

-e 'ACCEPT_EULA=Y': confirms your acceptance of the End-User Licensing Agreement required to use the SQL Server image. Review the license terms, then answer Y.

-e 'SA_PASSWORD=<YourStrong!Passw0rd>': replace <YourStrong!Passw0rd> with a password that meets the password policy.

-p 1433:1433: maps the host's TCP port (the first number) to the container's TCP port (the second number). Here the host and container use the same TCP port.

microsoft/mssql-server-linux: specifies the SQL Server container image. If no version is specified, the latest version is always used.

 

After starting the SQL Server Docker container with the command above, verify it with the following command:

docker ps -a

 

The output wraps and is a little hard to read, but as long as STATUS shows "Up", as in the example, all is well. If STATUS shows "Exited", revisit the system requirements: memory size, free disk space, and so on. In particular, double-check that you did not start it on a Standard A1 with 1.75 GB of memory!

 

Connecting to SQL Server on Docker

As of SQL Server 2017 CTP 2.0, the SQL Server command-line tools are included in the container image, so let's use them. Connect to the Docker container with the following command:

docker exec -it '<Container ID>' "bash"

 

For '<Container ID>', specify the ID you found with the docker ps command earlier.

 

Once connected to the container, you can work with the sqlcmd tool that ships in the container image. sqlcmd is not on the PATH, so you need to specify the command with its full path, like this:

/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourPassword>'

 

Let's issue a SELECT statement against SQL Server to list the databases, as shown below.
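
For example, from the sqlcmd prompt (a minimal query; any catalog query would do):

SELECT name FROM sys.databases;
GO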

The list of system databases is displayed, as expected. To quit, type the "exit" command.

 

Instead of working at the command line, you can also connect to SQL Server in the Docker container with the familiar SQL Server Management Studio (SSMS). To do so, first add an endpoint in the Azure Portal that allows traffic to port 1433. SSH is allowed by default, but SSMS traffic would otherwise be blocked by the firewall.

 

After adding the endpoint, start SSMS and connect to SQL Server in the Docker container with the settings shown below.
Replace the server name and password according to your own environment.

 

Being able to manage it from the familiar UI is reassuring.

 

Running multiple SQL Server containers

With Docker containers, you can easily run multiple SQL Server instances on the same host. Let's start several SQL Server instances with the commands below; this example starts two SQL Server Docker containers bound to ports 1401 and 1402.

docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrong!Passw0rd>' -p 1401:1433 -d microsoft/mssql-server-linux
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrong!Passw0rd>' -p 1402:1433 -d microsoft/mssql-server-linux

 

In no time, multiple SQL Server containers are up. To use one, connect to the individual environment by specifying the IP address and port number, as shown below. Standing up and managing many environments is usually hard work, but with SQL Server on Docker containers, managing databases across multiple environments in the enterprise becomes simple and flexible.
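
For example, to reach the container bound to port 1401 (a sketch; this assumes the mssql-tools package is installed on the host, and the "server,port" syntax is standard for SQL Server clients; from SSMS, enter "<host IP>,1401" as the server name instead):

sqlcmd -S localhost,1401 -U SA -P '<YourStrong!Passw0rd>'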

Persisting data

Data persistence is a challenge in Docker container environments. Stopping and starting a container (docker stop / docker start) does not lose data, but removing a container (docker rm) deletes all of the container's data, databases included. Let's use a Docker data volume to persist the databases.

 

Using a data volume

If you start the container with a host directory mounted as a data volume, the data volume's contents survive even when an individual container is removed.

 

Start the Docker container with the following command:

docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrong!Passw0rd>' -p 1433:1433 -v <Data Volume>:/var/opt/mssql -d microsoft/mssql-server-linux

The key part is the <Data Volume>: specification after -v in the command above.

Without specifying a <Data Volume>, a directory is created and mounted as a data volume when the container starts, but that volume is removed together with the container.

When you specify the <Data Volume> explicitly, the mounted directory is created on the host's file system, so the data survives container removal, and because it is kept under /var/lib/docker, it does not interfere with the host.
Check the created data volume with the commands below; you can confirm that the volume we created, <Data Volume> = "mssql_data", is mounted at the expected mount point.

docker volume ls

docker inspect <Data Volume>

 

 

If you want to delete a persisted data volume, you can do so with the following command:

docker volume rm <Data Volume>

Backup and restore

Backups and restores can be run with the sqlcmd command, just as in a regular Linux environment, or from SSMS.

You can also copy files out of the Docker container with the following command, which makes file-level backups possible as well:

docker cp 'Container ID':/var/opt/mssql/data <host directory>

 

The example below copies the SQL Server database files of container ID 'c88794bfad57' to /tmp on the host server.

Summary

We walked through SQL Server in a Docker container environment, from setup through features you can use in the operations phase.

Opportunities to apply container technology in the enterprise keep growing. Especially when you need to provision many development environments, SQL Server on Docker enables a flexible setup that benefits both developers and system administrators.

Please give this new capability of SQL Server 2017 a try.

 

Related articles

What is SQL Server on Linux? (Part 1)

What is SQL Server on Linux? (Part 2)

Always On with SQL Server on Linux!

What is SQL Server Integration Services (SSIS) on Linux?

Error 32042 is logged in the Skype for Business server event log


This is Matsumoto from the Skype for Business support team.

This post describes what to check when error 32042 is logged in the [Lync Server] event log.

We see error 32042 relatively often in customer support cases, usually in situations the SfB server administrators did not intend, so we are sharing the details here.
(For example, a certificate was distributed by policy, or slipped in during a software installation.)

Skype for Business Server 2015 and Lync Server 2013 use server certificates to TLS-encrypt traffic on the wire, so it can be used with confidence.
As part of this, the certificate state is checked periodically. If a certificate other than a root certificate is found in the Trusted Root Certification Authorities store during that check, error 32042 is logged in the event log.

Log name: Lync Server
Source: LS User Services
Event ID: 32042
Description:
An invalid HTTPS certificate was received.

Subject name: <FE server certificate subject name> Issuer: <CA>
Cause: This issue can occur when the HTTPS certificate has expired or when the certificate is not trusted. The certificate serial number is attached for reference.
Resolution:
Check the remote server and verify that the certificate is valid. Also verify that the complete certificate chain of the issuer is present on the local computer.

Event Error 32042

Symptoms
When error 32042 is logged, the following behavior can appear:

- The Front End service does not start on the Front End server
- TLS communication between Front End servers stops working

A KB article covering the Front End service failing to start has been published:

TITLE: Lync Server 2013 Front-End service cannot start in Windows Server 2012
URL: https://support.microsoft.com/ja-jp/help/2795828/lync-server-2013-front-end-service-cannot-start-in-windows-server-2012

 

What to check, and how to fix it
Check the certificates on the Front End server.

Log on to the Front End server with a user that has administrator rights.
1. Type "mmc.exe" in Search and launch it.
2. In the console window, open [File] - [Add/Remove Snap-in].
3. In the Add or Remove Snap-ins dialog, select [Certificates] from the available snap-ins and click [Add].
4. In the Certificates snap-in dialog, select [Computer account], then click [Next] and [Finish].
5. In the Add or Remove Snap-ins dialog, click [OK].
6. In the left pane, open [Console Root] - [Certificates (Local Computer)] - [Trusted Root Certification Authorities] - [Certificates].
7. In the displayed list of root certificates, confirm that [Issued To] and [Issued By] match for each certificate. (A PowerShell shortcut for this check appears after the steps below.)
(In the screenshot, the red-boxed entries do not match.)


If any certificates do not match (that is, anything other than root certificates is present):
8. Take a screenshot of the current state (just in case you need to restore it later).
9. Move the non-matching certificates to [Intermediate Certification Authorities] - [Certificates] by drag and drop.
10. Once everything other than root certificates has been moved, restart the FE server.
11. After the restart, confirm that the Front End service is running and that error 32042 is no longer logged at startup.
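
As a quicker alternative to clicking through each certificate in step 7, a PowerShell one-liner can list any certificates in the trusted root store whose subject and issuer differ (this uses the built-in Cert: drive and only reads the store):

Get-ChildItem Cert:\LocalMachine\Root |
    Where-Object { $_.Subject -ne $_.Issuer } |
    Format-List Subject, Issuer, Thumbprint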

(Note: having anything other than root certificates in the Trusted Root Certification Authorities store is the incorrect state,
which is why the fix is to move (or remove) the offending certificates out of that store.)
Thank you for your continued support of Microsoft Unified Communication products.

The contents of this post (including attachments and links) are current as of the date of writing and are subject to change without notice.

CIMOL Goes to Seattle: Briefing and Final Preparations


On team CIMOL's second day in Seattle, the competition process got under way. In the morning, every team registered and received their badges and the full event schedule. After registration, each team had the chance to take a team photo, complete with a video interview.


After the team photo, all teams received a briefing from the Microsoft team, covering everything from presentation and pitching techniques to technical details such as the video switch and the countdown timer.


After dinner, Microsoft took all the teams to Kerry Park, a park with a view of the Seattle skyline. Besides a group photo with all the Imagine Cup 2017 World Finals participants, team CIMOL also took the time for its own team photo and photos with other teams.



After the photos, we returned to the dorms at the University of Washington. Before resting, team CIMOL practiced a little more to prepare for the Technology Showcase and the Quarterfinals.

Tomorrow, on the first day of competition, team CIMOL will present at its booth to 3 judges in the morning, for 10 minutes each. If the team makes it through this preliminary round, it will compete that afternoon in the Quarterfinals, which will feature the Top 32 teams.

Please wish team CIMOL luck in reaching the Top 32, and in having the booth and Quarterfinals presentations go smoothly. Follow the live tweets at @MSDevID for further updates.

Intelligence from social list hashtags can power a more human search


Underneath the layer of factual information and numerical data is a deeper, more personal Internet. The online world comprises vast data and information resources and our search engines are adept at crawling through it and finding answers. But what if a user needs more than just a straightforward answer? What if the user needs insights from others’ personal experiences, opinions, or abstract ideas and philosophies?

Microsoft's Senior Applied Researcher Manish Gupta recently partnered with Ankan Mullick, Prof. Pawan Goyal, and Prof. Niloy Ganguly from IIT Kharagpur to deploy artificial intelligence and machine learning to help us get more meaningful answers to social queries from the Internet. Here's a closer look at the findings, which were recently published in a white paper.

Seeking the intangible

Today, search engines crawl the Internet and extract answers based on a rigorous analysis of keywords and phrases more quickly and effectively than ever before. Questions such as “What’s the median income in London?”, “How many hours would it take to walk across the Great Wall of China?”, and “In which action movies did Tom Hanks star?” can all get an instantaneous answer.

Search engine algorithms are great at working with fact-based queries and providing structured answers. However, search engines are surprisingly ineffective at answering subjective and personal questions.

Queries based on human experiences and personal opinions are difficult for a standard search engine to comprehend. This means users cannot rely on the algorithm to provide meaningful and helpful answers for questions such as, “How to make small talk with new friends,” “People’s favorite memories from school,” “How does it feel to immigrate to a new country?” or “The songs that defined the 80’s.”

While traditional search engines may struggle with such deeply human queries, there are online platforms specifically tailored for personal opinions and conversations: social media. Twitter, specifically, has become a forum for people to create sustained online conversations held together by a common hashtag. Twitter hashtags, coupled with the 140-character limit per post, streamline the conversation and center it on a single theme. These themes tend to be deeply personal and human. With hashtags and social conversations, Twitter provides information complementary to what can be accessed using traditional search engines. This is precisely why Microsoft researchers picked the platform for this study.

About the study

The purpose of the study was to extract meaningful information from social conversations to answer social list queries. To this end, our researchers collected around 4 million hashtags that were trending between January 2015 and June 2015. Out of these, around 67K multi-word hashtags referring to a conversational and personal theme were extracted, using an SVM (Support Vector Machine) classifier. We call such hashtags “idioms”. Since social list names can be expressed using multiple words, social list hashtags are a subset of idioms. Hence, the first step was to enable a classifier to learn to classify an idiom as a social list hashtag versus one that is not. On identifying social list hashtags, related tweets can be used to extract list items. List items for such social lists can be ranked using various factors like popularity and recency.

Datasets were created based on the length and popularity of the hashtags used as well as specific details extracted from the Twitter profiles of the users who tweeted. The intention was to manually annotate some of the idioms as social list hashtags versus those that are not, and use that data for machine learning. Once the model has been learned, it should be able to extract social list hashtags from a large pool of idioms with significant precision.

The raw dataset included nearly 0.2 billion tweets and close to 85 million URLs. The dataset was pre-processed to segment hashtags and detect parts of speech. Social list hashtag detection from a set of idioms focused on three types of features: linguistic (use of numbers, hashtag length, and vocabulary ratios), search (coverage in top 10 or 20 search results on a popular search engine), and Twitter (duration of popularity on the platform and distribution of co-occurring hashtags). The system was evaluated using a comprehensive 10-fold cross-validation on metrics such as precision, recall, and overall accuracy.

The results

Altogether, the high-recall classifier was able to work through the dataset and uncover around 67,000 idioms. These included deeply personal and human hashtags such as #foreveralone, #awkwardcompanynames, #childhoodfeels, and #africanproblems. These idioms were further condensed into social lists based on particular personal themes.

Factors such as the duration of hashtag popularity, co-occurring hashtags, and associated URLs were used to detect context and classify the social lists accurately.

The recall optimized social list hashtag detection system demonstrated 75% precision and 95.3% recall. As expected, Twitter proved to be a treasure trove of valuable social information and opinions. The paper conclusively demonstrated that relevant social information and opinions can be classified as social lists by a high-recall classifier system. This algorithm forms the basis for a better search engine for social platforms.

To sum up

Social media has helped augment the Internet with a layer of deeply social information, experiences, emotions and opinions. This layer of data can help inform users looking for subjective information and trusted opinions.

Traditional search engines struggle with subjective and opinionated information. The structured, keyword-oriented nature of conventional search doesn't offer valuable insights based on the actual experiences and opinions of others. While a search engine can effortlessly tell users the distance between Sydney and Auckland, it can't help users learn from others' experiences of learning a new language or getting married to a childhood friend.

Researchers at Microsoft worked with the IIT Kharagpur team to develop a system that can scour social networks and detect valuable insights from public conversations. Highly effective and precise, this system could form the basis for a deeper, more meaningful search engine.

 

Small Basic - Graduate with VS2017

What's new in Cognitive Services


Microsoft Cognitive Services lets developers power the next generation of applications with the ability to see, hear, speak, understand, and interpret needs using natural methods of communication.

We have announced several service updates:

  • We are launching the Bing Entity Search API, a new service available in preview, which makes it easy for developers to build experiences that leverage the power of the Bing knowledge graph for more engaging contextual experiences. It harnesses the power of the web to find the most relevant entities, such as movies, books, famous people, and companies, and makes it easier to surface details and sources of information about them.
  • Presentation Translator, a Microsoft Garage project, is now available for download. It gives presenters the ability to add subtitles to their presentations in real time, in another language for multilingual scenarios. With custom speech recognition, presenters have the option to customize the speech recognition engine (English or Chinese) using the vocabulary in their slides and slide notes, to adapt to jargon, technical terms, product names, place names, and so on.

Take a look at what these new APIs and services can do for you.

Bring relevant information about people, places, things, and local businesses to your apps with the Bing Entity Search API

The Bing Entity Search API is a new addition to our existing set of Microsoft Cognitive Services Search APIs, which includes Bing Web Search, Image Search, Video Search, News Search, Bing Autosuggest, and Bing Custom Search. This API lets you search the Bing knowledge graph for entities and retrieve the most relevant ones, along with their main details and sources of information. The API also supports searching for local businesses in the United States. It helps developers easily build applications that harness the power of the web and delight users with more engaging contextual experiences.

Getting started

  • The first step is to get a free subscription key on the Cognitive Services website.
  • After obtaining the key, I can start sending entity search queries to Bing. It's as simple as sending the following query:
GET https://api.cognitive.microsoft.com/bing/v7.0/entities?q=mount+rainier HTTP/1.1
Ocp-Apim-Subscription-Key: 123456789ABCDE
X-Search-ClientIP: 999.999.999.999
X-Search-Location: lat:47.60357;long:-122.3295;re:100
Host: api.cognitive.microsoft.com

The request must specify the query parameter q, which contains the user's search term, and the Ocp-Apim-Subscription-Key header. For location queries, such as "restaurants near me", it is important to also include the X-Search-Location and X-MSEdge-ClientIP headers.

For more information on getting started, see the Making your first entities request documentation page.
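
For instance, the same request can be issued from C# (a sketch; the endpoint, query parameter, and header name are taken from the raw request above, and the key is a placeholder for your own):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class EntitySearchSample
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Authenticate with the subscription key header shown above.
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<YourSubscriptionKey>");
            var url = "https://api.cognitive.microsoft.com/bing/v7.0/entities?q=mount+rainier";
            // The service returns the JSON document shown in the next section.
            var json = await client.GetStringAsync(url);
            Console.WriteLine(json);
        }
    }
}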

The response

The following shows the response to the Mount Rainier query:

{
    "_type" : "quot;SearchResponse",
    "queryContext" : {
        "originalQuery" : "mount rainier"
    },
    "entities" : {
        "queryScenario" : "DominantEntity",
        "value" : [{
            "contractualRules" : [{
                "_type" : "ContractualRules/LicenseAttribution",
                "targetPropertyName" : "description",
                "mustBeCloseToContent" : true,
                "license" : {
                    "name" : "CC-BY-SA",
                    "url" : "http://creativecommons.org/licenses/by-sa/3.0/"
                },
                "licenseNotice" : "Text under CC-BY-SA license"
            },
            {
                "_type" : "ContractualRules/LinkAttribution",
                "targetPropertyName" : "description",
                "mustBeCloseToContent" : true,
                "text" : "en.wikipedia.org",
                "url" : "http://en.wikipedia.org/wiki/Mount_Rainier"
            },
            {
                "_type" : "ContractualRules/MediaAttribution",
                "targetPropertyName" : "image",
                "mustBeCloseToContent" : true,
                "url" : "http://en.wikipedia.org/wiki/Mount_Rainier"
            }],
            "webSearchUrl" : "https://www.bing.com/search?q=Mount%20Rainier...",
            "name" : "Mount Rainier",
            "image" : {
                "name" : "Mount Rainier",
                "thumbnailUrl" : "https://www.bing.com/th?id=A21890c0e1f...",
                "provider" : [{
                    "_type" : "Organization",
                    "url" : "http://en.wikipedia.org/wiki/Mount_Rainier"
                }],
                "hostPageUrl" : "http://upload.wikimedia.org/wikipedia...",
                "width" : 110,
                "height" : 110
            },
            "description" : "Mount Rainier, Mount Tacoma, or Mount Tahoma is the highest...",
            "entityPresentationInfo" : {
                "entityScenario" : "DominantEntity",
                "entityTypeHints" : ["Attraction"],
                "entityTypeDisplayHint" : "Mountain"
            },
            "bingId" : "9ae3e6ca-81ea-6fa1-ffa0-42e1d78906"
        }]
    }
}

For more information on consuming the response, see the Searching the web for entities and places documentation page.

Try it now

Feel free to try it out in the Entity Search API Testing Console.

 

Create more natural user experiences with gestures – Project Prague

Project Prague is a novel, easy-to-use SDK that helps developers and UX designers incorporate gesture-based controls into their applications. It lets you quickly define and implement custom hand gestures, creating a more natural user experience.
The SDK lets you define your desired hand poses using simple constraints built from plain language. Once a gesture is defined and registered in your code, you will be notified when the user performs it, and you can choose an action to assign in response.
Using Project Prague, you can enable your users to intuitively control videos, bookmark web pages, play music, send emojis, or summon a digital assistant.

Say I want to create a new gesture, "RotateRight", to control my application. First, I need to make sure I meet the hardware and software requirements; please see the requirements section for more information. Intuitively, when performing "RotateRight", a user would expect some object in the foreground application to be rotated by 90°. In the video above, we used this gesture to trigger the rotation of an image in a PowerPoint presentation.

The following code shows one way to define the "RotateRight" gesture:

 

var rotateSet = new HandPose("RotateSet", new FingerPose(new[] { Finger.Thumb, Finger.Index }, FingerFlexion.Open, PoseDirection.Forward),
                                          new FingertipPlacementRelation(Finger.Index, RelativePlacement.Above, Finger.Thumb),
                                          new FingertipDistanceRelation(Finger.Index, RelativeDistance.NotTouching, Finger.Thumb));

var rotateGo = new HandPose("RotateGo", new FingerPose(new[] { Finger.Thumb, Finger.Index }, FingerFlexion.Open, PoseDirection.Forward),
                                        new FingertipPlacementRelation(Finger.Index, RelativePlacement.Right, Finger.Thumb),
                                        new FingertipDistanceRelation(Finger.Index, RelativeDistance.NotTouching, Finger.Thumb));

var rotateRight = new Gesture("RotateRight", rotateSet, rotateGo);

The "RotateRight" gesture is a sequence of two hand poses, "RotateSet" and "RotateGo". Both poses require the thumb and index finger to be open, pointing forward, and not touching. The difference between the poses is that "RotateSet" specifies that the index finger must be above the thumb, while "RotateGo" specifies that it must be to the right of the thumb. The transition from "RotateSet" to "RotateGo" therefore corresponds to a rotation of the hand to the right.

Note that the middle, ring, and pinky fingers play no part in the definition of the "RotateRight" gesture. This makes sense because we do not want to constrain the state of those fingers in any way. In other words, those fingers are free to assume any posture while the "RotateRight" gesture is performed.

Once the gesture is defined, I need to wire the event that signals gesture detection to the appropriate handler in the target application:

rotateRight.Triggered += (sender, args) => { /* This is called when the user performs the "RotateRight" gesture */ };

Detection itself takes place in the Microsoft.Gestures.Service.exe process. This is the process behind the "Microsoft Gestures Service" window mentioned earlier. It runs in the background and acts as a service for gesture detection. I need to create a GesturesServiceEndpoint instance to communicate with this service. The following snippet creates a GesturesServiceEndpoint instance and registers the "RotateRight" gesture for detection:

var gesturesService = GesturesServiceEndpointFactory.Create();
await gesturesService.ConnectAsync();
await gesturesService.RegisterGesture(rotateRight);

When you wish to stop detecting the "RotateRight" gesture, you can unregister it as follows:

await gesturesService.UnregisterGesture(rotateRight);

The handler will no longer fire when the user performs the "RotateRight" gesture. When you are done working with gestures, remember to dispose of the GesturesServiceEndpoint object:

gesturesService?.Dispose();

Additionally, for the code above to compile, you will need to reference the following assemblies, located in the directory indicated by the MicrosoftGesturesInstallDir environment variable:

  • Microsoft.Gestures.dll
  • Microsoft.Gestures.Endpoint.dll
  • Microsoft.Gestures.Protocol.dll

For more information on getting started, see the documentation.

Daniel Ortiz López
Technical Evangelist Intern
@ortizlopez91

Invitation: SQL Server Bootcamp 2017


Curious about what's new in SQL Server 2017 and how you can put it to practical use in your own applications? The WUG community has prepared a free two-day educational conference for database developers, administrators, and BI specialists, where practically oriented sessions led by leading Czech SQL Server experts will introduce you not only to what's new in SQL Server 2017, but also to best practices from various areas of SQL Server.

More than 20 sessions are being prepared, running in 2 parallel tracks on both days. The conference program will be published at the end of July; in the meantime, you can suggest topics to include in the program.

Where? Faculty of Informatics, Masaryk University (A318), Botanická 68a, Brno

When? August 15-16, 2017

Registration


Microsoft Azure Enables NIST CSF Compliance: Detect Function


Today, as part of our ongoing support of the Cybersecurity Executive Order, I am pleased to announce the third release in a series of documents on enabling compliance with the NIST Cybersecurity Framework (CSF) through Microsoft Azure services. This release specifically outlines how to implement the Detect function requirements using the services offered by Azure. The whitepaper is available on the Service Trust Portal under "Compliance Guides." We will publish a similar guide for the remaining CSF Function areas in the coming weeks.

Microsoft is committed to assisting our Federal customers, who must comply with the Presidential Executive Order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure. We are continuing to develop tools and resources to help with both addressing the core risks outlined in the order, and implementing the NIST Cybersecurity Framework (CSF) as the order requires.

Check out http://aka.ms/cybersecurityeo to find our consolidated blogs, whitepapers, videos, risk assessment templates, compliance automation software, and schedule of events related to the order. Check back weekly for new content throughout the Executive Order reporting period.

As we continue this series, your feedback will allow us to create more specialized guidance and documentation. E-mail AzureBlueprint@Microsoft.com with all feedback on our NIST CSF Guidance.

How can I find out how many threads are active in the CLR thread pool?



A customer was looking for a way to determine programmatically how many threads are active in the CLR thread pool.



There is no method that returns this information directly, but you can snap two blocks together:



// First out parameter: worker threads; second: I/O completion threads.
int max, max2;
ThreadPool.GetMaxThreads(out max, out max2);
int available, available2;
ThreadPool.GetAvailableThreads(out available, out available2);
// Active worker threads = configured maximum minus currently available.
int running = max - available;


But even though we answered the question, we don't know what the customer's problem is. The customer was kind enough to explain:



We have an issue where we exhaust the thread pool, causing our latency to skyrocket. We are investigating possible mitigations, and knowing when we are close to saturating the thread pool would tell us when we need to take more drastic measures. The thread pool threads are not CPU-bound; they are blocked on SQL queries. We have a long-term plan to use async/await, but that is a large change to our code base that will take time to implement, so we're looking for short-term mitigations to buy ourselves some time.


A colleague pointed out that if your thread pool threads are all blocked on SQL queries against the same server, then adding more threads won't help, because the bottleneck is not the thread pool. The bottleneck is the SQL server. Any new thread pool threads you add will eventually block on SQL queries to the same unresponsive server.



Now, if your workload consists entirely of work items that access the database, then the database is your bottleneck, and there's not much you can do on the client to make it go faster. But if your workload is a mix of work items that access the database and work items that don't access the database, then you at least don't want the non-database work items to be blocked behind database work items.



If this were a Win32 application, you could create a second thread pool and queue database work items to that thread pool. Non-database work items go to the default thread pool. When the second thread pool runs out of threads, it stalls the processing of other database work items, but the non-database work items are not affected, because they are running on a different thread pool.



But the CLR doesn't let you create a second thread pool, so your database work items and non-database work items have to learn to live in harmony.



Rewriting the code to be "async all the way down" may not be practical in the short term, but you could make it async at the top. Suppose your database work item looks like this:



ThreadPool.QueueUserWorkItem(() =>
{
    DoDatabaseStuff(x, y, z);
    MoreDatabaseStuff(1, 2, 3);
});


Add a single async at the top:



ThreadPool.QueueUserWorkItem(async () =>
{
    using (await AccessToken.AcquireAsync())
    {
        DoDatabaseStuff(x, y, z);
        MoreDatabaseStuff(1, 2, 3);
    }
});


The purpose of the AccessToken class is to control how many threads are doing database stuff. We put it in a using so that it will be Disposed when control exits the block. This ensures that we don't leak tokens.



Since the AcquireAsync method is async, work items do not consume a thread while they are waiting for a token. By controlling the number of tokens, you can control how many thread pool threads are doing database work. In particular, you can make sure that database work items don't monopolize the thread pool threads, leaving enough threads for your non-database work items.
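
A minimal sketch of such a class, assuming an implementation built on SemaphoreSlim with an arbitrary cap of 20 concurrent tokens (the class isn't shown here, so the details are illustrative):

using System;
using System.Threading;
using System.Threading.Tasks;

sealed class AccessToken : IDisposable
{
    // Assumption for illustration: at most 20 concurrent database work items.
    static readonly SemaphoreSlim s_semaphore = new SemaphoreSlim(20);

    private AccessToken() { }

    public static async Task<AccessToken> AcquireAsync()
    {
        // Waiting here does not tie up a thread pool thread.
        await s_semaphore.WaitAsync().ConfigureAwait(false);
        return new AccessToken();
    }

    public void Dispose()
    {
        // Returning the token lets the next waiting work item proceed.
        s_semaphore.Release();
    }
}

The cap is a tuning knob: it trades database concurrency against thread pool availability for everything else.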



¹ Maoni Stephens pointed out that there's also a managed debugging library called ClrMD which gives you a lot of information about the thread pool. You may want to start with the ClrThread class.

Scaling Scrum with Nexus in VSTS


This post is from Premier Developer consultant Assaf Stone.


In this post, I will cover what Scrum Nexus is; where, when, and why you would want to use it; and how best to set up VSTS to accommodate your Scrum practices. As VSTS’s tooling is not yet perfect for some of Nexus’s practices, I will also discuss some viable fallbacks and workarounds.

Background – Why Scale?

Scrum.org’s official Scrum guide defines Scrum as “a process framework used to manage complex product development”. Scrum’s events, artifacts, and rules revolve around a Scrum team, which consists of a Scrum Master, a Product Owner, and 3-9 development team members (a.k.a. developers).

This limit on team size is important. For Scrum to succeed, the team must consist of developers who can cover all the work required for the product to be delivered at the requisite quality level. If, however, there are fewer than 3 developers, the team is likely to be missing some of the skills required to deliver the product. More than 9 team members will require too great a coordination effort and result in an unmanageable process.

Therefore, when delivering a large product, as is often the case in enterprise-level projects, the organization must scale beyond the single Scrum team. A framework is required to manage and coordinate the work of multiple Scrum teams.

Enter Nexus.

Continue reading on Assaf’s blog here.


Networking Related Commands for Azure App Services


The purpose of this blog is to give a general overview of the available commands to troubleshoot network connectivity issues with web apps, specifically when connecting the web apps to VNETs either in an App Service Environment (ASE) or a standard web app with a Point-to-Site VPN connection. These commands can be used via the web app’s Console available in the Azure Portal or the Kudu console.

I assume the following in this blog:
1. You are familiar with the terms and use cases of ping, DNS, VNETs, and nslookup.
2. You are familiar with Kudu or the Console blade in the Azure portal.

  • Tcpping.exe: This command is similar to ping or psping; you can use it to test whether a web app can reach an endpoint via a hostname or IP address on a given port. If a web app cannot reach an endpoint via hostname, it’s always a good idea to test the corresponding IP address in case there is an issue with the DNS lookup. Tcpping will always default to port 80 unless another port is specified, i.e. “<hostname or IP address>:port”. For more information about the command and additional switches, type tcpping in the console. Note that the -t and -n switches are best used in Kudu.

    Examples:
    tcpping google.com
    tcpping google.com:80
    tcpping 10.0.0.5:443
    tcpping google.com:443 -t

  • Nameresolver.exe: This command is similar to nslookup; it will do a DNS lookup against the DNS server that is configured for the web app. By default, a standard app service will use Azure DNS. If the app service is configured with VNET integration (this includes both ASE types as well), it will use the custom DNS servers configured for the VNET. To specify a different DNS server for the lookup, add the IP address of the server after the hostname, separated by a space, i.e. “hostname <DNS Server IP>”.

    Examples:
    nameresolver google.com
    nameresolver google.com 8.8.8.8

  • SET WEBSITE_DNS_: This command will output the current DNS server that is being used by the web app. If the error “Environment variable WEBSITE_DNS_ not defined” is received, no custom DNS servers are configured for the web app.
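
    Example:
    SET WEBSITE_DNS_

    Illustrative output, assuming a custom DNS server of 10.0.0.4 is configured (the value shown is an assumption; yours will reflect your own configuration):
    WEBSITE_DNS_SERVER=10.0.0.4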

 

For more in-depth troubleshooting steps refer to this article: https://docs.microsoft.com/en-us/azure/app-service-web/web-sites-integrate-with-vnet#troubleshooting

Get Started with F# as a C# developer



One of our previous posts, Why You Should Use F#, listed a few reasons why F# is worth trying out today. In this post, we'll cover some of the basics you need to know to be successful. This post is intended for people who are coming from a C#, Java, or other object-oriented background. The concepts covered here should seem very familiar to existing F# programmers.

This post won't make any attempt to show you how to "translate" C# code into F#. This is because C# and F# represent different programming paradigms, which makes each suited to its own purpose. We think you'll find that learning functional programming concepts expands your mind and helps you become a better programmer. That value comes fastest if you don't try to translate from one paradigm to another.

Now is the time to be curious, inquisitive, and ready to learn brand-new things. Let's get going!

Immediate differences

Before we dive into concepts, let's look at a small snippet of F# code and see a few areas where F# differs from C#. Here is some basic F# code with two functions and a printed result:
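
For example (a representative sketch; the exact listing may differ slightly):

let square x = x * x

let sumOfSquares n =
    [1..n]
    |> List.map square
    |> List.sum

printfn "%d" (sumOfSquares 5) // prints 55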

Note that there are no type annotations, semicolons, or braces. The only parentheses are used to call sumOfSquares with 5 as input, printing its result. The pipeline operator (|>) is used much like a Unix-style pipe. square is a function passed directly as a parameter to the List.map function (functions are first-class values in F#).

Although there are many more differences we could talk about, there are deeper things going on here, and they are key to understanding F#.

Mapping core C# concepts to core F# concepts

The following table provides a basic mapping of some of the core concepts from C# to F#. It's intentionally short and non-exhaustive so that it's easy to remember as you begin to learn F#.

C# and Object-Oriented Programming | F# and Functional Programming
Variables                          | Immutable values
Statements                         | Expressions
Objects with Methods               | Types and functions

Here's a quick primer on some of this terminology:

  • Variables are values which can change in-place, or vary. It's in the name!
  • Immutable values are values which cannot change after they're assigned.
  • Statements are units of work, performed imperatively, by a running program.
  • Expressions are units of code which evaluate to a value.
  • Types are classifications of data in a program.

It's worth noting that everything in the C# column is also possible in F# (and quite easy to accomplish). There are also things in the F# column which are possible in C#, though they're more difficult to accomplish. It's also worth noting that items in the left column are not "bad" for F#, either. Objects with methods are perfectly valid to use in F#, and can often be the best approach for F# depending on your scenario.

Immutable values instead of variables

One of the most transformative concepts in functional programming is immutability. It's often underrated in the functional programming community. But if you've never used a language where immutability is the default behavior, it's often the first and biggest hump to get over. Nearly all functional programming languages have immutability at their core.
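
Consider this statement:

let x = 1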

In the previous statement, the value of 1 is bound to the name x. x now always refers to the value 1 for its lifetime, and cannot be modified. For example, the following code does not reassign the value of x:
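
let x = 1
x = x + 1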

Instead, the second line is an equality comparison to see if x is equal to x + 1. Although there is a way to mutate x by making it mutable and using the <- operator (see Mutable Variables for more), you'll quickly find that it's easier to think about how to solve problems without reassigning values. This allows you to play to the strengths of F#, rather than treat it as another imperative programming language.

We said that immutability was transformative, and that means that there are some very concrete differences in approaches to solving a problem. For example, for loops and other basic imperative programming operations are not typically used in F#.

As a more concrete example, say you wish to compute the squares of an input list of numbers. Here is an approach to that in F#:
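
let square x = x * x

let getSquares items =
    items
    |> List.map square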

Notice that there isn't a for loop to be seen. At a conceptual level, this is very different from imperative code: we're not squaring each item in the list one at a time; we are mapping the square function over the input list to produce a new list of squared values. This distinction is subtle in concept, but in practice it can lead to dramatically different code. For starters, getSquares actually produces a whole new list.

Immutability does more than just change the way you manipulate data in lists. The concept of Referential Transparency will come naturally in F#, and it is a driver in how systems are built and pieces of that system are composed. Execution characteristics of a system become more predictable, because values cannot change when you didn't anticipate them to change.

Furthermore, when values are immutable, concurrent programming becomes simpler. Because values cannot be changed due to immutability, some of the more difficult concurrency problems you can encounter in C# are not a concern in F#. Although the use of F# does not magically solve all concurrency problems, it can make things easier.

Expressions instead of statements

As mentioned earlier, F# makes use of expressions. This is in contrast with C#, which uses statements for nearly everything. The difference between the two can initially seem subtle, but there is one thing to always keep in mind: an expression produces a value. Statements do not.
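
For example (the messages here are illustrative):

let getMessage word =
    if word = "hello" then "hello yourself!"
    else "didn't quite catch that"

let msg = getMessage "hello" // msg is "hello yourself!"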

In the previous code sample, you can see a few things which are very different from imperative languages like C#:

  • if...then...else is an expression, not a statement.
  • Each branch of the if expression produces a value, which in this case is the return value of the getMessage function.
  • Each invocation of getMessage is an expression which takes a string and produces a string.

Although this is very different from C#, you'll most likely find that it feels natural when writing code in F#.

Diving a bit deeper, F# actually uses expressions to model statements. These return the unit type. unit is roughly analogous to void in C#:
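
let printNumbers () =
    // printfn returns unit, and so does the for expression itself
    for i in 1 .. 3 do
        printfn "%d" i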

In the previous sample, the for expression (and everything inside it) is of type unit. Unit expressions are expressions which return no value.

F# arrays, lists, and sequences

The previous code samples have used F# arrays and lists. This section explains them a bit more.

F# comes with a few collection types, and the most commonly used ones are arrays, lists, and sequences.

  • F# arrays are .NET arrays. They are mutable, which means that their values can be changed in-place. They are evaluated eagerly.
  • F# lists are immutable singly-linked lists. They can be used to form list patterns with F# pattern matching. They are evaluated eagerly.
  • F# sequences are immutable IEnumerable<T>s under the covers. They are evaluated lazily.

F# arrays, lists, and sequences also have array, list, and sequence expression syntax. This is very convenient for different scenarios where you can generate one of these collections programmatically.
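
For example (the ranges here are arbitrary, chosen just to show the syntax):

let squaresArray = [| for i in 1 .. 5 -> i * i |]   // array expression
let squaresList  = [ for i in 1 .. 5 -> i * i ]     // list expression
let squaresSeq   = seq { for i in 1 .. 5 -> i * i } // sequence expression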

Mapping common LINQ methods to F# functions

If you are familiar with LINQ methods, the following table should help you understand analogous functions in F#.

LINQ       | F# function
Where      | filter
Select     | map
GroupBy    | groupBy
SelectMany | collect
Aggregate  | fold or reduce
Sum        | sum

You'll also notice that the same set of functions exist for the Seq module, List module, and Array module. The Seq module functions can be used on F# sequences, lists, or arrays. The array and list functions can only be used on F# arrays and F# lists, respectively. Additionally, F# sequences are lazy, whereas F# lists and arrays use eager evaluation. Using Seq functions on an F# list or F# array will incur lazy evaluation, and the type will then be an F# sequence.

Although the previous paragraph may be a lot to unpack, it should feel intuitive as you write more F#.

Functional pipelines

You may have noticed the |> used in previous code samples. This operator is very similar to unix pipes: it takes something on the left-hand side, and makes that the input to something on the right. This operator (called "pipe", or "pipeline") is used to form a functional pipeline. Here's an example:
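
let getOddSquares items =
    items
    |> Seq.filter (fun x -> x % 2 <> 0) // keep only the odd numbers
    |> Seq.map (fun x -> x * x)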

Walking through this code sample, items is passed as input to the Seq.filter function. The output of Seq.filter, a sequence, is then passed as input to the Seq.map function. The output of Seq.map is the output of getOddSquares.

The pipeline operator is so much fun to use that it's rare to see F# code which doesn't make use of it. It's usually near the top of everyone's list of favorite F# features!

F# Types

Because F# is a .NET language, it shares the same primitive types that C# does: string, int, and so on. It also has .NET objects, supports the four main pillars of object-oriented programming, and has tuples. In addition, F# has two primary types not found in C#: Records and Discriminated Unions.

A Record is a named, ordered grouping of values which have equality baked in. And by equality, we mean in the most literal sense. There is no need to distinguish between reference equality or some custom definition of value equality between two objects. Records are values, and values have equality. They are Product Types for the category theorists out there. They have a number of uses, but one of the most obvious ones is a replacement for POCOs or POJOs.
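
For example, a minimal record type (the names here are just for illustration):

type Person = { Name: string; Age: int }

let alice = { Name = "Alice"; Age = 30 }
let aliceAgain = { Name = "Alice"; Age = 30 }
let areEqual = (alice = aliceAgain) // true: records compare by value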

The other foundational F# type is a Discriminated Union, or DU. DUs are a type which could be one of a number of named cases. These are Sum Types for the category theorists out there. They can also be recursively-defined, which dramatically simplifies hierarchical data.
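
A sketch of such a tree, consistent with the description below (a tree is either empty or a node carrying a value and two subtrees):

type BST<'T> =
    | Empty
    | Node of 'T * BST<'T> * BST<'T>

let rec flip tree =
    match tree with
    | Empty -> Empty
    | Node (value, left, right) -> Node (value, flip right, flip left)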

Ta-da! Armed with the power of Discriminated Unions and F#, you can pass any programming interview which requires you to flip a binary search tree.

You may have noticed a bit of funky syntax in the Node case of the tree definition. This is actually a type signature for a tuple. That means that a BST, as we've defined it, can either be empty, or a tuple of (value, left subtree, right subtree). Read more about Signatures to learn more.

Putting it all together: F# Syntax in 60 seconds

The following snippet of code is presented with permission from Scott Wlaschin, an F# community hero who wrote the following great overview of F# syntax. You should be able to read through it in about a minute. It has been edited slightly.

Additionally, there is the Tour of F# document in our official documentation for .NET and its languages.

What you can do next

This post covered a lot of things, but it only began to scratch the surface of F#. We hope that after reading it, you'll be able to dive further into F# and functional programming, and start building things with the language as a way to learn it even further.

There are many, many more things you can use F# for, from something as simple as a build script to forming the backend of a billion-dollar eCommerce site. There is no shortage of projects you can use F# for.

Additional resources

It's worth noting that there is also a wealth of information about learning F#, including material aimed at developers coming from a C# or Java background; it will be helpful as you dive deeper into F#.

There are also multiple documented ways to get started with F#.

Finally, the F# community is incredibly welcoming to beginners. There is a very active Slack run by the F# Software Foundation, with rooms for beginners, which you can access by joining for free. We highly encourage you to do so!
